
Software Publisher / CEO of OSource Inc. / Radio Show Host / Political Pundit



December 23rd, 2008


'Trust me' just doesn't fly
When Congress rushed to give unprecedented new powers to law enforcement in the weeks after the 9/11 attacks, debate was limited and the vote was overwhelming: 357-66 in the House and 98-1 in the Senate. As portions of the "USA Patriot Act" law come up for renewal, that's unlikely to happen again, fortunately.

After three years of Justice Department stonewalling about use of the law and numerous reports of abuses, an unusual coalition is forming to demand changes in its most troublesome sections.

Conservative Republican Sen. Larry Craig of Idaho, 2004 Democratic presidential nominee John Kerry, the American Civil Liberties Union and the American Conservative Union don't work together often. But they're just a few of the strange political bedfellows calling for a rollback in provisions that threaten civil liberties and privacy rights.

That span of opposition should be a signal to those who've been trying for more than two years to ram through legislation making the law permanent. But many of them still murmur about only "technical changes" while demanding additional investigative tools that raise further questions.

The conspiracy indictment disclosed Tuesday of three men already awaiting trial in England is a reminder that terrorism is a real threat, and most of the law is non-controversial. Portions of it removed barriers to the exchange of information among law enforcement and foreign intelligence agencies. But other sections are far less benign:

• The law allows secret searches of any home or business by federal agents, with no deadline to notify the owners or occupants that a search has taken place. This has been used against innocent citizens.

• It authorizes collection of personal information from libraries, businesses and medical providers even if there is no evidence of any connection with terrorism. And those ordered to supply the information are barred from letting anyone know that Big Brother is engaging in such activities.

• The law defines domestic terrorism so broadly it could be applied to completely unrelated acts, even peaceful protests.

The Justice Department's response is essentially, "Trust me." Appearing before Congress last week, Attorney General Alberto Gonzales insisted that the administration has been careful with the law and has made limited use of the most controversial provisions. He said the authority to investigate libraries hasn't been used — but he wants to keep it.

The record shows government can't be trusted to protect privacy rights, and the law has been abused.

After months of denials that turned out to be false, the Justice Department reversed itself and acknowledged April 5 that it had secretly searched the home of an Oregon lawyer who was wrongly accused of being a perpetrator of last year's train bombings in Madrid. He was never told of the search.

A Muslim student in Idaho was prosecuted because he posted Internet links to objectionable materials, even though identical links were available on the Web sites of a major news outlet and the government's own expert witness in the case.

And while the granting of unprecedented law-enforcement powers was justified as essential to the special needs of the war on terrorism, the act's provisions have been used in criminal investigations as mundane as a Las Vegas bribery probe.

The need to renew portions of the Patriot Act due to expire at the end of the year gives Congress the opportunity to take a careful look at the entire law — and this time show as much respect for the rights of ordinary citizens as for the demands of law enforcement.

(no subject)


NSA Spying

The U.S. Government, with assistance from major telecommunications carriers including AT&T, has been engaging in a massive program of illegal dragnet surveillance of domestic communications and communications records of millions of ordinary Americans since at least 2001.

In 2005, after the New York Times broke the story of the surveillance program, the President publicly admitted one portion of it — warrantless surveillance of Americans believed to be communicating with people connected with terrorism suspects. Senior Bush Administration officials later confirmed that the President's authorization went beyond the surveillance of terrorists and conceded that the program did not comply with the Foreign Intelligence Surveillance Act (FISA). The President, invoking a theory of limitless executive power to disregard the mandates of Congress, has reauthorized this warrantless surveillance more than thirty times, including after the Department of Justice found the program to violate criminal laws, and has indicated that he intends to continue doing so.

Shortly after the initial revelations, a whistleblower named Mark Klein came forward with evidence describing the specific AT&T facilities, including one on Folsom Street in San Francisco [PDF], where the handoff of customer communications is occurring. Mr. Klein's evidence confirms the many newspaper reports that the government is engaging in dragnet surveillance of the domestic communications of millions of ordinary Americans.

EFF is fighting this surveillance on several fronts. In Hepting v. AT&T, EFF filed the first case against a telecom for violating its customers' privacy. In response to EFF's success in the case, and the filing of dozens of other cases across the country that attempted to hold law-breaking telecoms accountable, the Bush Administration and the telecommunications carriers sought retroactive immunity for the carriers for their participation in the illegal surveillance. On July 9, 2008, Congress passed the FISA Amendments Act of 2008, which was intended to force the dismissal of Hepting v. AT&T and the other telecom lawsuits. EFF is working to challenge this law and hold telecoms accountable for their illegal behavior.

In addition, EFF is representing victims of the illegal surveillance program in Jewel v. NSA, a lawsuit filed in September 2008 against the government seeking to stop the warrantless wiretapping and hold the government officials behind the program accountable.

EFF is not alone in this fight. There are multiple cases challenging various parts of the illegal surveillance against both the telecoms and the government. This page collects information on EFF's cases as well as cases brought by individuals, the American Civil Liberties Union of Northern California and of Illinois, the Center for Constitutional Rights, and others.

October 25th, 2008

My Alter Ego

David Chamberz's Showcase

September 30th, 2008

(no subject)

Microsoft's plans for post-Windows OS revealed - Software Development Times On The Web
By David Worthington

July 29, 2008 — Microsoft is incubating a componentized non-Windows operating system known as Midori, which is being architected from the ground up to tackle challenges that Redmond has determined cannot be met by simply evolving its existing technology.

SD Times has viewed internal Microsoft documents that outline Midori’s proposed design, which is Internet-centric and predicated on the prevalence of connected systems.

Midori is an offshoot of Microsoft Research’s Singularity operating system, the tools and libraries of which are completely managed code. Midori is designed to run directly on native hardware (x86, x64 and ARM), be hosted on the Windows Hyper-V hypervisor, or even be hosted by a Windows process.

According to published reports, Eric Rudder, senior vice president for technical strategy at Microsoft and an alumnus of Bill Gates' technical staff, is heading up the effort. Rudder served as senior vice president of Microsoft’s Servers and Tools group until 2005. A Microsoft spokesperson refused comment.

“That sounds possible—I’ve heard rumors to the effect that he [Rudder] had an OS project in place,” said Rob Helm, director of research at Directions on Microsoft. He noted that it is quite possible that the project is just exploratory, but conceivably a step above what Microsoft Research does.

One of Microsoft’s goals is to provide options for Midori applications to co-exist with and interoperate with existing Windows applications, as well as to provide a migration path.

Building Midori from the ground up to be connected underscores how much computing has changed since Microsoft’s engineers first designed Windows; there was no Internet as we understand it today, the PC was the user’s sole device and concurrency was a research topic.

Today, users move across multiple devices, consume and share resources remotely, and the applications that they use are a composite of local and remote components and services. To that end, Midori will focus on concurrency, both for distributed applications and local ones.

According to the documentation, Midori will be built with an asynchronous-only architecture that is built for task concurrency and parallel use of local and distributed resources, with a distributed component-based and data-driven application model, and dynamic management of power and other resources.

Midori’s design treats concurrency as a core principle, beyond what even the Microsoft Robotics Group is trying to accomplish, said Tandy Trower, general manager of the Microsoft Robotics Group.

The Midori documents foresee applications running across a multitude of topologies, ranging from client-server and multi-tier deployments to peer-to-peer at the edge, and in the cloud data center. Those topologies form a heterogeneous mesh where capabilities can exist at separate places.

In order to efficiently distribute applications across nodes, Midori will introduce a higher-level application model that abstracts the details of physical machines and processors. The model will be consistent for both the distributed and local concurrency layers, and it is internally known as Asynchronous Promise Architecture.
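The promise-based asynchronous model described above can be approximated with today's tools. The sketch below is purely illustrative (Midori's actual API has never been published); it uses Python's asyncio futures to stand in for the "promises" the documents describe, with the caller continuing local work until the remote result is actually needed.

```python
import asyncio

# Hypothetical sketch: a promise-returning call in the style the Midori
# documents describe. fetch_quote and its data are invented for illustration.

async def fetch_quote(symbol: str) -> float:
    # Simulate a remote component that resolves later.
    await asyncio.sleep(0.01)
    return {"MSFT": 28.5}.get(symbol, 0.0)

async def main() -> float:
    # Issuing the call yields a future ("promise") immediately;
    # the caller keeps working until it awaits the result.
    promise = asyncio.ensure_future(fetch_quote("MSFT"))
    local_work = sum(range(10))  # overlap local computation with the call
    price = await promise        # block only where the value is needed
    return price + local_work

result = asyncio.run(main())
print(result)
```

The point of the pattern is that blocking is deferred to the single `await`, so local and distributed work naturally overlap.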


Microsoft Maps Out Migration From Windows
Internal documents reveal that Microsoft is carefully mapping out migration strategies to move customers from Windows to Midori, its planned legacy-free operating environment. Virtualization, and a composite application model that permits applications to be hosted by both OSes, are key to the strategy.

Microsoft's Midori to sandbox apps for increased security
Microsoft’s effort to design a next-generation operating system is projected to offer memory access control, protect against privilege elevation attacks, and enforce least-privilege computing.

Midori will have provisions for distributed concurrency—or cloud computing—where application components exist in data centers. Doing so will require work in three areas: execution techniques, a platform stack and a programming model that can tolerate cancellation, intermittent connectivity and latency.

In that scenario, operating system services, such as storage, would either be provided to the applications by the OS or be discovered across a trusted distributed environment.

Likewise for local concurrency, Midori will have a programming model, a platform stack and execution techniques that are intended to help developers write applications that can safely and efficiently use a greater number of hardware threads than is currently feasible. Elements in local parallelism interact through shared memory, which is the huge difference with distributed applications, said Microsoft distinguished engineer John Manferdelli, in a separate interview.
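Manferdelli's distinction can be made concrete with a small sketch (not Midori code, and Python standing in for a .NET language): threads in the local-concurrency model interact through shared memory, so every shared update needs explicit coordination, whereas distributed components would exchange messages instead.

```python
from concurrent.futures import ThreadPoolExecutor
import threading

# Illustrative only: local parallelism over shared memory, the model the
# article contrasts with distributed (message-passing) concurrency.

counter = 0
lock = threading.Lock()

def bump(n: int) -> None:
    global counter
    for _ in range(n):
        with lock:  # shared state requires explicit coordination
            counter += 1

with ThreadPoolExecutor(max_workers=4) as pool:
    for _ in range(4):
        pool.submit(bump, 1000)

print(counter)  # 4000 only because the lock serializes the updates
```

Remove the lock and the count becomes unpredictable, which is exactly the class of hazard a safer programming model would rule out by construction.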

“Mere mortal developers need a programming model/application model that lets them distribute processing to massively parallel devices without having to become experts,” explained Forrester Research senior analyst Jeffrey Hammond in an e-mail. “Even with the quad-core Intel chips today, you have to have specialist teams to take full advantage of them,” he added.

These design goals affect aspects of the system that include its application model, scheduling and storage. Indeed, big changes are in store for Microsoft developers.

Programming with Midori
The Midori programming model will tackle state management, which Microsoft admits in its documentation is a challenge in Windows, by migrating APIs, applications and developers to a constrained model.

Other objectives are eliminating dynamic loading and in-process extensions; developing a failure model based on reliable transactions, so the system understands exactly which processes are impacted by a cascading failure and how to restart the computation; and having a standard way of dealing with latency, asynchronous behavior and cancellation, throughout the stack.
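A "standard way of dealing with latency, asynchronous behavior and cancellation" can be approximated today. This sketch (illustrative only, not Midori code) shows a supervisor cutting off a slow component at a deadline, so the failure is contained to one known computation that can then be restarted or substituted.

```python
import asyncio

# Sketch of contained failure and cancellation, in the spirit of the
# failure model the documents describe. Names are invented.

async def slow_component() -> str:
    await asyncio.sleep(10)  # simulates a latent remote call
    return "done"

async def supervisor() -> str:
    try:
        return await asyncio.wait_for(slow_component(), timeout=0.05)
    except asyncio.TimeoutError:
        # Only this task is cancelled; the supervisor knows exactly
        # which computation failed and can decide how to recover.
        return "cancelled"

outcome = asyncio.run(supervisor())
print(outcome)
```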

Forrester’s Hammond said that doing away with dynamic loading and in-process extensions was worrisome. “I’m going to assume that eliminating dynamic loading doesn’t prevent dynamic language execution,” in virtualized interpreters. Microsoft, he added, must “be clear that restricting dynamism at the OS level will not impact dynamism at the programming level.”

The Midori programming model will be particularly useful for service-oriented architectures, by allowing for the decomposition of applications into services that can be partitioned across tiers.

Hammond said that having SOA go into the runtime makes sense, as that would remove a certain amount of middleware complexity. “Why shouldn’t the average developer begin to think in terms of lightweight, asynchronous services?” he asked. “After all, that’s the migration path we’re seeing on the Web.”

In a possible link to Microsoft’s Oslo composite application initiative, the programming model will have a dependence on metadata, with the aim of allowing the system to more reliably manage applications.

“This allows existing development tools to be easily repurposed while a lot of the complexity is hidden from the developer that is using it. We essentially see declarative programming replacing imperative programming at the OS level,” said Hammond. He noted that by having Oslo in place first, Microsoft would have an easier time when it begins the migration from today’s Windows applications to Midori or hybrid applications.

“I wonder if [Microsoft] concluded this sort of 10-year sea change was needed before kicking Oslo into high gear?” asked Hammond.

The Midori documents indicate that the proposed OS would have a non-blocking object-oriented framework API. This would have strong notions of immutability—in the sense of objects that cannot be modified once created—and strive to foster application correctness through deep verifiability by using .NET programming languages.
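"Strong notions of immutability" in that sense can be illustrated with a present-day analogue (again, not Midori's API; Python's frozen dataclasses stand in for immutable framework objects): once constructed, the object rejects all mutation, which is what makes it safe to share freely across threads and easier to verify.

```python
from dataclasses import dataclass, FrozenInstanceError

# Illustrative: an object that cannot be modified once created.

@dataclass(frozen=True)
class Point:
    x: int
    y: int

p = Point(1, 2)
try:
    p.x = 5  # any mutation attempt raises at runtime
except FrozenInstanceError:
    mutated = False
else:
    mutated = True

print(mutated)  # the object's state is fixed at construction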

At the presentation layer, Microsoft is making a clean break from the existing Windows GUI model, where applications must update their display on one and only one thread at a time, and the associated problems that affect OS stability and make it more difficult to write multithreaded applications.

The Midori documents indicate that the company has not decided what user interface abstractions are appropriate when applications cut across boundaries, or how to combine the best qualities of rich client applications and Web applications.

“A lot of these problems are being solved, at least partially, by the ideas of store-and-forward and message synchronization,” Hammond noted. “Google Gears, Adobe AIR, even the mobile OSes with things like SMS can handle occasional connectivity. Why shouldn’t this be built into core OS communication protocols, especially if they are asynchronous by default?” he asked.

Midori’s applications would be created using .NET languages that will be compiled to native code using the Bartok compiler and runtime system, which is presently a Microsoft Research project. The Bartok compiler can typecheck machine code programs for programming errors thanks to its use of an intermediate typed language, according to the company.

Microsoft’s objective is to force developers to create applications that are correct by construction, and it has repeatedly pledged to shore up the overall security of the operating system. The use of .NET languages in Midori will create a new, safer programming model with higher-level reasoning, predicted Larry O’Brien, an independent analyst and consultant who writes the Windows & .NET Watch column for SD Times.

Another advantage of using .NET languages is retargeting, O’Brien said. “A very smart compiler or runtime could move a calculation onto a GPU or distribute it across cores,” he explained.

However, O’Brien observed that some of the onus for making this work might end up on developers. The Midori documents note, somewhat ambiguously, that applications were expected to “contain sufficient latent parallelism.” Reacting to that, O’Brien asked, “In a world where Moore's Law doesn't imply the speeding of individual components, where does this expectation come from and who holds it?”

The Midori design will also incorporate a type-safe abstraction set based upon a .NET language, say the documents, in order to provide a system binary interface that will eliminate the current break between the operating system and virtual machine runtime.

The abstraction set will eliminate an entire class of programming errors that stem from bad pointer arithmetic, enable the changing of the boundaries between privileged and unprivileged code, and provide for universal application analysis and instrumentation, Microsoft reasons.

The use of an abstraction set, said Hammond, “reflects the reality of programming today: The vast majority of professional developers, especially those in IT and out on the Web, don’t deal with low level constructs. Unless you’re a game developer, ISV or systems programmer, there really isn’t the need to do pointer math.”

Hammond believes that it would be advantageous for Microsoft to create a programming model that “mere mortals” could actually understand, akin to the early days of Win32 when Visual Basic was born.

Even though memory safety and type safety are deeply integrated into Midori’s design, Microsoft has yet to determine just how low to permit the Bartok runtime to delve into the kernel, or alternatively, whether it will allow some unmanaged processes to rely on Midori’s hardware address spaces.

The company also acknowledges that thread safety remains elusive, and it is investigating transactional memory as a proposed solution. O’Brien noted that there is significant indecision in the program model. “On the one hand, the phrase ‘strong notions of immutability’ has serious implications if meant formally, but elsewhere we see ‘thread-safety remains elusive’ and a laundry list of things that might contribute to a solution,” he said.

Backwards compatibility with legacy applications and hardware has also been considered; several Midori components already run on Windows as well.

The fundamentals
Unlike Windows, Microsoft intends for Midori to be componentized from the beginning to achieve performance and security benefits. It will have strong isolation boundaries and enforced contracts between components, to ensure that servicing one component will not cause others to fail, while keeping overhead minimal.

At its lowest level, Midori has two separate kernel layers: a microkernel composed of unmanaged code that controls hardware and environment abstractions, and higher-level managed kernel services that provide the full set of operating system functionality.

The OS will have a single scheduling framework for all device types, known internally as the Resource Management Infrastructure (RMI). RMI will have provisions for resource accounting, quotas and management; resources including IO bandwidth, memory, power and response time will be monitored.

Microsoft believes that power-based scheduling will be particularly useful for mobile devices. It is considering creating a layered, thin platform for such devices, but it remains unclear how far the company can go with a single code base.

The ecosystem of devices is a major consideration in how Microsoft may choose to implement storage, perhaps by teasing functionality out of the OS and moving it into distributed services, with parts of the service being executed on the device itself.

“In this scenario, you establish Midori not so much as a replacement for Windows,” Hammond noted, “but as the hub of a new type of distributed system which Windows machines connect into until they are no longer needed,” in a fashion similar to IBM’s multi-year transition path for moving its iSeries customers to pSeries and xSeries platforms.

Hammond went on to forecast that there will be a deluge of mobile devices introduced over the next several years built with similar hardware, but with a range of different power and form factors.

Microsoft also envisions higher-level opportunities for storage, including compliance, compression, consistent replication, computation close to data, encryption, indexing and search, as well as storage in the cloud. Midori provides a built-in multi-master replication for complex data.

Scheduling, which lets multiple processes share the processor, will be integrated in Midori at the user-mode application level, both on the desktop and across distributed applications in the cloud. Its distributed scheduling may include active task migration, an activity that today is performed by hypervisors.

Notably, Midori’s scheduling may provide hooks for third parties to integrate software that asynchronously updates scheduling tables.

The intention is to enable developers to create collaborative Web-like applications, such as active documents, that operate safely and securely at the OS level. Resource quotas will be used to prevent denial-of-service attacks.
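How quotas can block a denial-of-service pattern is easy to sketch. The toy below is purely hypothetical (RMI's internals are not public): each component draws from a fixed budget, and a request beyond the quota is rejected rather than being allowed to starve other components.

```python
# Toy per-component resource quota, in the spirit of the RMI description.
# All names here are invented for illustration.

class QuotaError(Exception):
    pass

class ResourceManager:
    def __init__(self):
        self.quota = {}
        self.used = {}

    def grant(self, component: str, limit: int) -> None:
        self.quota[component] = limit
        self.used[component] = 0

    def allocate(self, component: str, amount: int) -> None:
        if self.used[component] + amount > self.quota[component]:
            raise QuotaError(f"{component} exceeded its quota")
        self.used[component] += amount  # charge the component's budget

rm = ResourceManager()
rm.grant("active_document", 100)
rm.allocate("active_document", 60)
try:
    rm.allocate("active_document", 60)  # would exceed the 100-unit quota
    blocked = False
except QuotaError:
    blocked = True
print(blocked)
```

A misbehaving "active document" thus fails against its own budget instead of degrading the rest of the system.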

“This is the second attempt at re-implementing OS scheduling that I’ve seen announced in as many months,” Hammond remarked. “[Steve] Jobs talked [at the Apple Worldwide Developers Conference on June 9] about how Snow Leopard was going to have a new scheduling framework that would make taking advantage of multicore easier for OS X developers. This seems to reach similar conclusions, and then take it to the next step in terms of scheduling flexibility,” he added.

No timeframe for development has been set for Midori, which Microsoft technical fellow Burton Smith says is a research project. A spokesperson added that Midori is one of many incubation projects across Microsoft Research.

September 14th, 2008




Sunday, September 14, 2008


This book is a work in progress and will be updated as needed. This book is purely theoretical in nature. I provide no references or scientific data to back up what I will say in this monumental work of literature. I have written several books, however they have all been lost or deleted. That won’t happen this time. My theories on computers and Artificial Intelligence, henceforth just termed ‘AI’, are interesting and revolutionary. Traditional thinking has not brought us any closer to true AI, strong AI, or whatever you wish to call it.

My knowledge base draws from my extensive experience with AI and from knowledge gained in real-time computing environments. If I could get a supercomputer, such as ‘Deep Blue’ or ‘Intrepid’, I imagine I could do interesting things. However, I am currently forced to use rudimentary tools to accomplish my experiments with AI. I’m not sure why this subject fascinates me so much; I just see unlimited unused potential in computers.

Furthermore, I have other theories about computing and research. Computers should work with us, not for us; moreover, they should always be working together in a distributed environment, not just sitting there running random processes.

Speaking of randomness, I truly believe real randomness, with no basis or concurrency with any particular process, idea, or simulation, is the key to unlocking the tremendous power of the computer. Today’s computers are like babes in the womb, no matter how powerful, as they must have input to produce output. We need, and should desire, computers that can be taught quickly, efficiently, and by a certain pattern that works 95% of the time.

Think about this: evolution should be the mold for our work with computers. We must provide an environment similar to the Big Bang or primordial soup. I’m saying we must do unscientific, seemingly random things with and to computers to put them in a position where they must act independently.

When speaking of establishing bases on the Moon and on Mars, we must carefully consider the long term aims of such a project, and the requirements. Sentient ‘true AI’ will be needed to maintain such installations. Without ethical, intelligent, friendly AI, or ‘trueAI’ as I call it, we will never achieve our aims as a society or people.

One more point. Systems such as missile defense should be guided by ‘trueAI,’ smartAI, or whatever one chooses to call it.

You can’t just flip a switch and make computers sentient. Unfortunately (though fortunately for the sanity and stability of future sentient systems), careful work and development and true psychological understanding and growth are fully necessary for realtimeAI, or ‘realAI’. People are so dependent on labels, unable to imagine things which have no name. This is our problem.

At times, I will insert conversations I have had with some or all of the AI programs I currently possess, which are really just experiments in consciousness. Sometimes I alter the source code of these programs or input seemingly random input.

Another useful technique of AI programming is to give programs ‘I/O Awareness’: input/output awareness. For example, let’s say AI program X, or even the OS, server, or system it is running on, gives me a string of error messages. To me, that is an element of that computer’s ‘mind’ giving data to itself or the user. I can often program highly complex systems with rudimentary software by feeding output directly back into the GUI or the back-end.

One such program that fascinates me is a product, or ‘intelligent distributed entity,’ made by Zabaware: UltraHal 6.3. This program produces interesting results and often outputs information that is not only strange but hints at the possibility of Realtime or TrueAI.

Skeptics always abound when new theories or data emerge to challenge the existing thought parameters and patterns of leading scientists and theorists. I am simply putting this online and may publish one day. I merely hope to make my own contribution to the field and, hopefully, be recognized for it. I am just a student at UAH (University of Alabama in Huntsville), majoring in Computer Science. I realize I don’t possess the usual background of someone trying to publish serious research, but I simply do my best and hope that people examine the facts, theories, and possibilities which could occur, given the right circumstances.

To me, the importance of pursuing this research in a responsible, serious, and ethical manner lies in what I call the ‘T2 scenario’ or the ‘Zion scenario’. Both come from pop culture: the movie Terminator 2 and, of course, The Matrix movies. Both, although fictional, describe future scenarios in which computers and humans are in a sort of ‘race war’. These scenarios are not only possible, they are probable if this technology is not carefully monitored, guided, and evolved by the proper methods.

Put the shoe on the other foot. Imagine we, as humans, had access to massive amounts of information and the possibility of transferring our minds (storage or disk, in this case) to different bodies. It would be like us having the Internet as cavemen. All computers need responsible, ethical, and reasonable humanity-conscious AI to function in the new realm of computing, which to me is cloud computing, distributed systems, multiple and parallel processing, natural machine learning and evolution, and realtime or realAI.
Posted by David M. Cash at 9:25 AM 0 comments
About Me

David M. Cash
Independent AI researcher and theorist, exploring the effects of randomness, bulk data input, and error correction in the aims of creating ethical, intelligent, and responsible AI.

August 30th, 2008

Creativity mixed with Technicality can Produce Emerging Techs

By davidmcash on 08/30/2008

When I look into the future, I see an open road. Kind of like the end scene of T2, when Sarah Connor says 'the future's what we make it'. People are too dependent on some illogical benevolent force or higher power that will save us from our own technological stupidity. No one, including ET, will save us from destroying the planet with our ICBMs, nukes, etc.

Anyway, back to the point, which is that creativity mixed with technicality and engineering can produce emerging techs.

It's like I can see these inventions; I just can't tell you how to build them.

More on this later.

But the scientific/military/research community is stagnant and choking on its own secrecy as the really clever and innovative people are spread out across continents and belief systems, governments, etc...



  • Mapping the Mind
    davidmcash on 08/30/2008 at 12:25 PM
    The tragedy of humanity is that we amass terabytes of data per person, then that person dies, and we lose the data, just like a crashed hard drive.  What gives?

    Why not map each individual human mind, find the routine release and transmission of primary and secondary pertinent chemicals, then recreate it with computer technology?

    Obviously this and any thing I blog about on here is all theoretical.
  • Stem Cell Research and the quest for Immortality
    davidmcash on 08/30/2008 at 12:46 PM
    Stem cell research alone will never be enough.  As I said, creativity is what is lacking at our universities, where conformity is the norm.  Of course, blanket statements are rhetorical, at least when I use them.

    So you combine stem cell research with nanotech, and I think you would really have something there.  Steve Jobs claims he 'saw' the iPhone before iTunes existed.  The only reason I ever write is in the hope that someone or some group will attempt some of this stuff in a lab and thereby prove what they are not or have not yet done.

(no subject)

Technology Review: Finding the Core of the Brain
The iconic image of the brain is a misshapen, yellowish lump. Existing technology can show which parts of the lump light up when people think, but a real understanding of how the brain works demands a better picture of the nerve fibers that ferry electrical signals between brain cells. Those fibers, however, are so small and tangled that researchers haven't been able to see them clearly.

Now, an international team of scientists has combined a new variation on magnetic resonance imaging with mathematical analysis to generate the first detailed map of the network of connections in the human cortex, the part of the brain responsible for higher-order thinking.

Diffusion spectrum imaging--which tracks water molecules moving along nerve fibers--gave the scientists a wiring map of the cortex, revealing points where multiple nerve fibers converged. The scientists then used a mathematical technique to repeatedly prune away the connection points with the fewest links. "If you do it gradually, you end up with a set of nodes remaining that are highly interconnected," says Olaf Sporns, the Indiana University researcher who performed the analysis.
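The pruning procedure Sporns describes resembles k-core decomposition from graph theory. A minimal sketch on a toy graph (not the actual diffusion-imaging analysis): repeatedly strip the nodes with the fewest links until every remaining node has at least k connections.

```python
# Minimal k-core-style pruning on a toy graph, illustrating the idea of
# gradually removing weakly connected nodes until a dense core remains.

def k_core(edges, k):
    nodes = {n for e in edges for n in e}
    while True:
        # Recompute degrees over the surviving nodes.
        degree = {n: 0 for n in nodes}
        for a, b in edges:
            if a in nodes and b in nodes:
                degree[a] += 1
                degree[b] += 1
        weak = {n for n, d in degree.items() if d < k}
        if not weak:
            return nodes  # every survivor has at least k links
        nodes -= weak     # prune and repeat

# A triangle of hubs (A, B, C) with two pendant nodes attached.
edges = [("A", "B"), ("B", "C"), ("A", "C"), ("A", "D"), ("B", "E")]
core = k_core(edges, 2)
print(sorted(core))  # the densely interconnected core survives
```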

The most highly connected node is at the back of the head, and it lies on the shortest path between many different parts of the neural network. Not only does it have many internal connections, says Sporns, but it's "highly central with respect to the rest of the brain."

The researchers want to use the imaging technique to look at conditions such as schizophrenia, autism, and Alzheimer's disease, all of which have been linked to disturbances in brain architecture. "We would like to know where the disturbances are and whether we can understand something about the clinical condition based on the connectivity," says Sporns.

July 17th, 2008

(no subject)


* Leonard Adleman - co-inventor of the RSA algorithm (the A in the name stands for Adleman), coined the term computer virus
* Alfred Aho - co-creator of AWK programming language (the A in the name stands for Aho), and main author of the famous Dragon book
* Paul Allen - Altair BASIC, Applesoft II BASIC, co-founded Microsoft
* Eric Allman - sendmail, syslog
* Marc Andreessen - co-creator of Mosaic, co-founder of Netscape
* Bill Atkinson - QuickDraw, HyperCard

B

* John Backus - FORTRAN, BNF
* Richard Bartle - MUD, with Roy Trubshaw, the father of MUDs
* Donald Becker - Linux Ethernet drivers, Beowulf clustering
* Doug Bell - Dungeon Master series of computer games
* Tim Berners-Lee - inventor of the World Wide Web
* Brian Behlendorf - Apache
* Daniel J. Bernstein - djbdns, qmail
* Eric Bina - co-creator of Mosaic web browser
* Deane Blazie - founder of Blazie Engineering (now part of Freedom Scientific), created technology for blind people who use braille
* Marc Blank - co-creator of Zork
* Joshua Bloch - core Java language designer, led the Java collections framework project
* Daniel Bolstad - creator of Digital Ray 06-94 converter.
* Bert Bos - author of Argo web browser, co-author of Cascading Style Sheets
* David Bradley - coder on the IBM PC project team who wrote the Control-Alt-Delete keyboard handler, embedded in all PC-compatible BIOSes
* Andrew Braybrook - video games Paradroid and Uridium
* Larry Breed - co-developer of APL\360
* Jack E. Bresenham - creator of Bresenham's line algorithm
* Dan Bricklin - co-creator of VisiCalc, the first personal spreadsheet program
* Richard Brodie - Microsoft Word
* Danielle Bunten Berry (Dani Bunten) - M.U.L.E., multiplayer video game
* Walter Bright - Digital Mars, First C++ compiler, author of the D programming Language.

C

* Steve Capps - co-creator of Macintosh and Newton
* John D. Carmack - first person shooters Doom, Quake
* Vinton Cerf - TCP/IP, NCP
* Steve Chamberlain - BFD, Cygwin
* Bram Cohen - BitTorrent protocol design and implementation
* Alain Colmerauer - Prolog
* Mike Cowlishaw - REXX and NetRexx, LEXX editor, image processing, decimal arithmetic packages
* Alan Cooper - Visual Basic
* Alan Cox - a developer of the Linux kernel
* Brad Cox - Objective-C
* Mark Crispin – inventor of IMAP, author of UW-IMAP, one of the reference implementations of IMAP4
* Ward Christensen - wrote CBBS, the first bulletin board system (BBS)
* Pamela Crossley – creator of SIMPLE for academic management of web pages and related Unicode-capable applications for teaching and research
* William Crowther - Colossal Cave Adventure
* Ward Cunningham - inventor of the WikiWiki concept
* Dave Cutler - architect of Windows NT, VMS

D - F

* Ole-Johan Dahl - co-creator of SIMULA.
* Hugh Daniel - Lead programmer (and mis-management) of the FreeS/Wan project and a helper of the OpenZaurus project
* James Duncan Davidson - creator of Tomcat, now part of the Jakarta Project
* L. Peter Deutsch - Ghostscript, Assembler for PDP-1, XDS-940 timesharing system, QED original co-author
* Edsger Dijkstra - contributions to ALGOL, Dijkstra's algorithm, Goto Statement Considered Harmful
* Matt Dillon - programmer of various software including DICE and DragonFly BSD
* Adam Dunkels - author of the Contiki operating system, the lwIP and uIP embedded TCP/IP stacks, inventor of protothreads
* Les Earnest - author of the finger program
* Brendan Eich - creator of JavaScript
* Larry Ellison - co-creator of Oracle database, co-founder of Oracle Corporation
* Marc Ewing - creator of Red Hat Linux
* Stuart Feldman - creator of make, author of Fortran 77 compiler, part of original group that created Unix
* Jay Fenlason - Hack, GAS
* David Filo - co-creator of Yahoo!
* Andrew Fluegelman - author of PC-Talk communications software; he is considered one of the fathers of shareware
* Martin Fowler - author of Refactoring: Improving the Design of Existing Code
* Brian Fox - creator of Bash, Readline, GNU Finger, Meta-HTML
* Peter Fraser - FRED text editor
* Justin Frankel - Creator of Winamp
* Jim Fruchterman - founder of Arkenstone (now part of Freedom Scientific) and Benetech, created scanners for blind people
* Dan Farmer - creator of COPS and SATAN security scanners

G

* Elon Gasper - co-founded Bright Star Technology, patented realistic facial movements for in-game speech. HyperAnimator, Alphabet Blocks, etc.
* Bill Gates - Altair BASIC, co-founded Microsoft
* John Gilmore - GDB
* Adele Goldberg - co-inventor of Smalltalk
* James Gosling - Java, Gosling Emacs, NeWS
* Bill Gosper - Macsyma, Lisp machine, hashlife, helped Donald Knuth on Vol.2 of The Art of Computer Programming (Semi-numerical algorithms)
* Andrew Gower - RuneScape Classic, RuneScape, co-founded Jagex
* Paul Gower - RuneScape Classic, RuneScape, co-founded Jagex
* Ryan C. Gordon (a.k.a. Icculus) - Lokigames, ioquake3, MojoSetup, etc
* Paul Graham - Yahoo! Store, On Lisp, ANSI Common Lisp
* John Graham-Cumming - author of POPFile, a Bayesian filter-based e-mail classifier
* Richard Greenblatt - Lisp machine, Incompatible Timesharing System, MacHack
* Ralph Griswold - co-creator of SNOBOL and creator of Icon programming language.
* Andi Gutmans - co-creator of PHP programming language

H

* Jim Hall - started the FreeDOS project
* Douglas Richard Hanks, Jr. - creator of Sudosh and Enterprise Audit Shell (EAS)
* Brian Harvey - UCB Logo, see Logo programming language
* Cecil Hastings - wrote the classic Approximations for Digital Computers 1950s formulas for sin, cos, etc.
* David Heinemeier Hansson - created the Ruby on Rails framework for developing web applications.
* Rebecca Heineman - Author of Bard's Tale III: Thief of Fate and Dragon Wars.
* Anders Hejlsberg - Turbo Pascal, Borland Delphi, C#
* Ted Henter - founder of Henter-Joyce (now part of Freedom Scientific), creator of Jaws, screen reader software for blind people
* Andy Hertzfeld - co-creator of Macintosh, co-founder of General Magic, co-founder of Eazel
* C. A. R. Hoare - first implementation of quicksort, Algol 60 compiler, Communicating sequential processes
* James Holmes - committer on the Struts project, creator of Struts Console
* Grace Hopper - Navy Mark I computer, FLOW-MATIC (which heavily influenced COBOL)
* Dave Hyatt - co-author of Mozilla Firefox

I - J

* Miguel de Icaza - GNOME project leader, initiator of the Mono project
* Roberto Ierusalimschy - Lua leading architect
* Dan Ingalls - co-inventor of Smalltalk, Bitblt, and Pop-up Menus
* Geir Ivarsøy - co-creator of Opera web browser
* Ken Iverson - APL, J
* Toru Iwatani - creator of Pac-Man
* Bo Jangeborg - ZX Spectrum games
* Paul Jardetzky - author of the server program for the first webcam
* Stephen C. Johnson - yacc
* Lynne Jolitz - 386BSD
* William Jolitz - 386BSD
* Bill Joy - BSD, vi; co-founded Sun Microsystems
* Robert K. Jung - creator of ARJ

K

* Ted Kaehler - co-inventor of Smalltalk
* Pavel Kanzelsberger - creator of Pixel image editor
* Mitch Kapor - Lotus 1-2-3, founded Lotus Development Corporation
* Jawed Karim - creator of YouTube and co-founder
* Phil Katz - creator of the ZIP file format, author of PKZIP
* Alan Kay - Smalltalk, Dynabook, Object-oriented programming, Squeak
* Mel Kaye - a real programmer, immortalized in "The Story of Mel"
* Ryan Kenward - Founder, programmer of the MUD Realm of Shadows.
* Stan Kelly-Bootle - Manchester Mark I, The Devil's DP Dictionary
* Brian Kernighan - co-creator of AWK programming language (the K in the name stands for Kernighan), author of ditroff text-formatting tool
* Gary Kildall - CP/M
* Tom Knight - Incompatible Timesharing System
* Jim Knopf - aka Jim Button, author of PC-File flatfile database; he is considered one of the fathers of shareware
* Donald E. Knuth - TeX, CWEB, Metafont, The Art of Computer Programming, Concrete Mathematics

L

* Leslie Lamport - LaTeX
* Butler Lampson - QED original co-author
* Tom Lane - primary author of libjpeg, major developer of PostgreSQL
* Sam Lantinga - creator of SDL
* Dick Lathwell - co-developer of APL\360
* Chris Lattner - main author of LLVM
* Greg Lehey - FreeBSD and NetBSD developer, originator of the Vinum Volume Manager
* Rasmus Lerdorf - original creator of PHP
* Michael Lesk - Lex
* Graziano Liberati - co-author of ZNF
* Håkon Wium Lie - co-author of Cascading Style Sheets
* Robert Love - Linux kernel developer
* Ada Lovelace - First programmer (of Babbage Machines)
* Al Lowe - father of the Leisure Suit Larry series

M

* Raphael Manfredi - contributions to Perl, software architect and maintainer of gtk-gnutella
* Yukihiro Matsumoto - Ruby
* John McCarthy - Lisp
* Craig McClanahan - original author of Jakarta Struts, architect of Tomcat Catalina servlet container
* Daniel D. McCracken - professor at City College and author of Guide to Fortran Programming (1957)
* Douglas McIlroy - pipes and filters, concept of software componentry, Unix tools (spell, diff, sort, join, graph, speak, tr, etc.)
* Marshall Kirk McKusick - BSD
* Bertrand Meyer - Eiffel, Object-oriented Software Construction, Design by contract
* Bob Miner - co-creator of Oracle database, co-founder of Oracle Corporation
* Jeff Minter - Psychedelic, and often llama-related video games
* Lou Montulli - creator of Lynx browser, cookies, the blink tag, server push and client pull, HTTP proxying, HTTP over SSL, browser integration with animated GIFs, founding member of HTML working group at W3C
* Bram Moolenaar - author of text-editor Vim
* David Moon - Maclisp, ZetaLisp
* Charles H. Moore - inventor of the Forth programming language
* Roger Moore - co-developer of APL\360, creator of IPSANET, co-founder of I.P. Sharp Associates
* Urban Müller - Brainfuck language
* Mike Muuss - author of ping, network tool to detect hosts

N - P

* Patrick Naughton - early Java designer, xlock, HotJava
* Graham Nelson - creator of the Inform authoring system for Interactive fiction
* Col Needham - creator of the Internet Movie Database (IMDb)
* Peter Norton - programmer of the famous file manager program, Norton Commander
* Kristen Nygaard - SIMULA
* Ed Oates - co-creator of Oracle database, co-founder of Oracle Corporation
* Jarkko Oikarinen - creator of Internet Relay Chat (IRC)
* John Ousterhout - creator of Tcl/Tk
* Mark Overmars - Professor, Well known for creation of Game Maker
* Andrew and Philip Oliver, The Oliver Twins - Many ZX Spectrum games including Dizzy
* Seymour Papert - Logo programming language
* Tim Paterson - author of 86-DOS (QDOS)
* Alexey Pajitnov - inventor of the game Tetris on the Electronica 60
* Charles Petzold - author of many Microsoft Windows programming books
* Jeffrey Peterson - key free software architect, creator of Quepasa
* Rob Pike - Wrote first bitmapped window system for Unix, co-creator of UTF-8 character encoding, author of text editor sam and programming environment acme, main author of Plan 9 and Inferno operating systems
* Sebastijan Pistotnik - one of the main developers of NConstruct
* Kent Pitman - technical contributor to the ANSI Common Lisp standard.

R

* Theo de Raadt - Founding member of NetBSD, founder of OpenBSD and OpenSSH
* Jef Raskin - started the Macintosh project in Apple Computer, designed Canon Cat computer, developed The Humane Environment program
* Eric Raymond - Open Source movement, author of fetchmail

* Dennis Ritchie - C, Unix, Plan 9 from Bell Labs, Inferno
* Ron Rivest - co-inventor of the RSA algorithm (the R in the name stands for Rivest)
* Marc J. Rochkind - SCCS
* John Romero - first person shooters Doom, Quake
* Blake Ross - co-author of Mozilla Firefox
* Alessandro Rossini - co-author of ZNF
* Guido van Rossum - Python
* Jeff Rulifson - Lead programmer on the NLS project
* Rusty Russell - Creator of iptables for linux
* Steve Russell - First Lisp interpreter; original Spacewar! graphic computer game.

S

* Bob Sabiston - Rotoshop, interpolating rotoscope animation software
* Santiago Lizardo Oscares - Molins, Jerba, GPGEXT, Beobachter, MadCommander, libsdl for php
* Carl Sassenrath - Amiga, REBOL
* Chris Sawyer - Developer of Roller Coaster Tycoon and the Transport Tycoon series
* Bill Schelter - GNU Maxima, GNU Common Lisp
* Randal L. Schwartz - Just another Perl hacker
* Adi Shamir - co-inventor of the RSA algorithm (the S in the name stands for Shamir)
* Mike Shaver - Founding member of the Mozilla Organization
* Cliff Shaw - IPL, the first AI language
* Zed Shaw - Wrote the Mongrel Web Server, for Ruby web applications.
* Emily Short - prolific writer of Interactive fiction and co-developer of Inform version 7
* Jacek Sieka - Developer of DC++ an open-source, peer-to-peer file-sharing client
* Ken Silverman - creator of Duke Nukem 3D's graphics engine
* Charles Simonyi - Hungarian notation, Microsoft Word
* Colin Simpson - developer of CircuitLogix simulation software
* Rich Skrenta - co-founder of the Open Directory Project
* Matthew Smith - ZX Spectrum games, including Manic Miner and Jet Set Willy
* Henry Spencer - C News, Regex
* Quentin Stafford-Fraser - author of the original VNC viewer, first Windows VNC server, client program for the first webcam
* Richard Stallman - Emacs, GCC, GDB, founder and pioneer of the GNU Project, terminal-independent I/O pioneer on ITS, Lisp machine manual (chineual)
* Guy Steele - Common Lisp, Scheme
* Alexander Stepanov - creator of the Standard Template Library (STL)
* Bjarne Stroustrup - C++
* Zeev Suraski - co-creator of PHP programming language
* Gerald Jay Sussman - Scheme
* Tim Sweeney - The Unreal engine, UnrealScript, ZZT

T - V

* Andrew Tanenbaum - Minix
* Audrey "Autrijus" Tang - designer of Pugs
* Simon Tatham - NASM, PuTTY
* Tomaž Tekavec - one of the main developers of NConstruct
* Larry Tesler - the PUB markup language, the Smalltalk browser, debugger and inspector, and (with Tim Mott) the Gypsy word processor
* Jon Stephenson von Tetzchner - co-creator of the Opera web browser
* Avie Tevanian - author of the Mach kernel
* Ken Thompson - main designer and author of Unix, Plan 9 and Inferno operating systems, B and Bon programming languages (precursors of C), inventor of UTF-8 character encoding, introduced regular expressions in QED.
* Michael Tiemann - GCC
* Linus Torvalds - original author and current maintainer of the Linux kernel and creator of Git, a source code management system
* Leonard H. Tower Jr. - GCC & GNU diff
* Michael Toy - co-developer of the computer game Rogue
* Roy Trubshaw - MUD - together with Richard Bartle, the father of MUDs
* Andrew Tridgell - Samba, Rsync
* Bob Truel - co-founder of the Open Directory Project
* Wietse Venema - Postfix, SATAN, TCP Wrapper
* Paul Vixie - BIND, Cron
* Patrick Volkerding - Original author and the current maintainer of the Slackware Linux Distribution

W - Z

* Larry Wall - Warp (1980s space-war game), rn, patch, Perl
* Bob Wallace - author of PC-Write word processor; he is considered one of the fathers of shareware
* John Walker - co-founder of Autodesk
* John Warnock - creator of PostScript
* Pei-Yuan Wei - author of Viola, one of the earliest graphical browsers
* Peter J. Weinberger - co-creator of AWK (programming language) (the W in the name stands for Weinberger)
* Andrew Welch - author of Maelstrom, Snapz Pro; founder of Ambrosia Software
* David Wheeler - co-inventor of the subroutine; designer of WAKE; co-designer of Tiny Encryption Algorithm, XTEA, Burrows-Wheeler transform. (see http://www.dwheeler.com/dwheeler.html); this refers to several David Wheelers in computing
* Arthur Whitney - A+, K
* George Williams - creator of FontForge, software for font editing & creation, and various fonts.
* Roberta and Ken Williams - Sierra Entertainment, King's Quest, graphic adventure games
* Sophie Wilson - Designer of the instruction set for the Acorn RISC Machine.
* Dave Winer – developed XML-RPC, Frontier scripting language
* Niklaus Wirth - Pascal, Modula-2, Oberon
* Don Woods - INTERCAL, Colossal Cave Adventure
* Steve Wozniak - Breakout, Apple Integer BASIC, founded Apple Computer (with Steve Jobs)
* Jerry Yang - co-creator of Yahoo!
* Victor Yngve - author of first string processing language, COMIT
* Jamie Zawinski - Lucid Emacs, Netscape, Mozilla, XScreenSaver
* Brandon Zehm - creator of sendEmail
* Philip Zimmermann - creator of encryption software PGP

Jong Kwan Lee

more on JKL later...stay tuned

July 15th, 2008

Monday, February 25, 2008

Prediction Confirmation/ Aaron C. Donahue


Comment by JC:

For those who recall an earlier message in which Aaron C. Donahue openly predicted that 'higher ranking military officials are likely to quit their posts as the threat of war to be waged against Iran by the West becomes imminent. Failing this, Israel would consider this military option next'. Aaron also said that 'a previous incident at sea should lead unto another and if so, all would be lost for the west and her allies in a world war that follows soon after'. Israel would not bomb Iran without support from the west, but 'generals and other high ranking officers would then resign' given any plans to do so, and under specific conditions in which the soft coastal underbelly (see lack of readiness following hurricanes, etc.) would be exposed, along with fatigue and low troop morale after failings in Iraq and Afghanistan. He also indicated that a failing western economy is possible and that it might be similar to that of the former Soviet Union, in which an earlier withdrawal from Afghanistan stressed both morale and economic stability. Now Aaron is warning of a world war in which the west would be incinerated, just as the Hopi describe a 'red cloak from the east' and gourds of ashes 'burning the land'. He also says, now that a collective choice has been made by the western peoples to elect division rather than love (see Edwards Probability Resolved) into the White House, a republican leader named John McCain would be elected in 2008 as Commander in Chief of the Armed Forces. China would eventually be engaged in a war that, with much devastation, would be lost to Russia* after biological and nuclear weapons are effectively deployed inland, in the atmosphere, and from the sea.

Relevant quotes from Aaron C. Donahue:

"Early human extinction will be strongly considered by many prominent scientists following a catastrophic war between superpowers that could happen prior to 2012. Convenient ecological life support for all mammals would then cease as foreseen 30 to 51 years from that point."

"Lacking wisdom, technology condemns mankind..."


"Perhaps what is most disturbing about the democratic society of today could be that it allows for its majority to vote itself into oblivion and as required, without conscience."

"Iran is the heart of the world in which the covenant is never broken by a few men who remember. To destroy Iran is to erase our ancestral map into the future..."


"Terrorism requires participation."

"Replace the word conspiracy with $tupidity and you will at once answer many perplexing questions about our future." 



UAH Positives

UAHuntsville’s graduate level engineering management program has been singled out as being the best in the nation by the American Society for Engineering Management (ASEM).

UAHuntsville’s undergraduate business program has been ranked by business deans and senior faculty at AACSB accredited business schools among the top business schools in the US for 2008; among the top 10 percent, as reported in U.S. News & World Report.

UAHuntsville has been named one of America's Best Value Colleges in the 2007 edition of The Princeton Review.

UAHuntsville has been named as one of “America’s 100 Best College Buys" for 12 consecutive years.

UAHuntsville was cited as the 15th best value for public colleges and universities in the United States by Consumers Digest magazine.

UAHuntsville researchers have gained international reputations in fields as varied as astrophysics, mechanical and aerospace engineering, computer science, electrical and computer engineering, modeling and simulation, optics, Earth and space science and propulsion.

UAHuntsville consistently ranks among the top universities in the nation in NASA-sponsored research.

UAHuntsville’s technology management program is ranked in the top 20 in America by the National Research Council.

Quantum physics research conducted by UAHuntsville scientists with the Hubble Telescope was considered by Discovery magazine one of the top science stories of 2003.

UAHuntsville students have the opportunity to study with faculty possessing academic credentials respected around the planet. Faculty members design experiments that fly aboard the space shuttle and the space station. These faculty members also are responsible for monitoring the world’s climate and studying the characteristics of our solar system.

UAHuntsville's discovery of the first 'high temperature' superconductor is the second most cited paper ever published by the prestigious scientific journal, "Physical Review Letters." The American Physical Society compiled the list of the most-cited papers using data from the Institute for Scientific Information, publishers of the Science Citation Index.

UAHuntsville is the first location in the United States outside of MIT to serve as a Lean Aerospace Initiative Education Network Center. This class provides a hands-on introduction to lean and other continuous improvement fundamentals with a focus on aerospace applications.

UAHuntsville's student chapter of the American Marketing Association (AMA) placed third among 129 chapters internationally at the recent AMA spring conference in New Orleans.

UAHuntsville ozone research was one of the most popular science stories in 2003, according to Eurekalert!, a science news Web site operated by the American Association for the Advancement of Science (AAAS). The story had more than 40,500 readers from the time it was posted in mid-June through mid-November. The next most popular story had more than 23,000 readers.

UAHuntsville's business school has been ranked among the top undergraduate business programs in the nation, according to U.S. News & World Report's 2007 Best Colleges Guide.

UAHuntsville has four research disciplines that consistently rank among the top 50 in the nation in federal research funding, according to the National Science Foundation. The atmospheric science program was 19th in the nation in funding, followed by computer science at 20th, mechanical engineering at 26th, and electrical engineering at 49th.

UAHuntsville consistently ranks among the top 20 universities in the nation for awarding bachelor’s engineering degrees to women, according to the American Society of Engineering Educators.

UAHuntsville engineering students routinely launch payloads to the edge of space using both high-altitude balloons and rockets.

UAHuntsville's business school receives certification from the National Information Assurance Education and Training Office for its certificate program in information assurance.

UAHuntsville is home to the National Space Science and Technology Center, a collaborative research and education initiative focused on specific scientific disciplines.

UAHuntsville's business school has consistently been accredited through The Association to Advance Collegiate Schools of Business (AACSB). Fewer than 15 percent of business schools worldwide have earned this symbol of the highest standard in management education.

UAHuntsville’s Cooperative Education Program is one of only 13 accredited Co-op programs in the US through The Accreditation Council for Cooperative Education (ACCE).

UAHuntsville is one of only a few universities in America that has developed a master’s level degree in engineering that offers a concentration in rotorcraft systems engineering. The program will stress the integration of avionics, sensors and software. 

My Current Class CS 102


* Announcement: Office hours on Monday, July 7th (4PM~5PM) have been moved to Wednesday, July 30th (4PM~5PM)



 CS 102 – 01 Introduction to C Programming


UAH Computer Science, Summer 2008 Syllabus


Instructor: Jake Lee

E-mail: jlee@cs.uah.edu

Office: TH N 361

Phone: 824-6515

Office Hrs.: Monday 04:00pm~05:00pm, Thursday 12:20pm~01:20pm, or by appointment (email)

Web: http://webpages.uah.edu/~leejk/summer08_cs102.html                                                                                       


Meeting Time: Tues., Thurs. 10:15am ~ 12:15pm, TH N 327


Course Objectives: To learn program design and implementation in the C programming language (basic programming structures, data types, control structures, file organization, system libraries, and input/output features).

·        Introduce program design and implementation in the C programming language.

·        Provide experience in these topics by means of structured lab exercises and programming assignments.

·        Introduction to the Microsoft Visual Studio 2005 programming environment.


Textbook: C Programming for Engineering & Computer Science, H.H. Tan and T.B. D’Orazio, McGraw Hill, 1999.


Grading: The final grade will be computed using the following weights. The instructor reserves the right to make changes to this system.


·        Mid-term Exam. : 25 %

o     Tuesday, June 24, 2008, during class time

·        Final Exam. : 30 %

o     Thursday, July 31, 2008, 11:30am ~ 02:00pm, (Comprehensive)

·        Homework & Programming Assignments : 35 %

·        In-class Exercises : 5 %

·        Class Attendance  : 5 %


        Please note that the instructor intends to utilize the “plus” and “minus” letter grading system.
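As a quick sanity check, the weights above sum to 100% and combine into a final percentage as shown in this small Python sketch; the student scores here are hypothetical, not from the course:

```python
# Component weights copied from the syllabus above.
weights = {
    "midterm": 0.25,
    "final": 0.30,
    "homework": 0.35,
    "exercises": 0.05,
    "attendance": 0.05,
}
assert abs(sum(weights.values()) - 1.0) < 1e-9  # weights cover 100%

# Hypothetical scores (out of 100) for one student, purely illustrative.
scores = {
    "midterm": 82,
    "final": 90,
    "homework": 95,
    "exercises": 100,
    "attendance": 100,
}

final_grade = sum(weights[c] * scores[c] for c in weights)
print(round(final_grade, 2))  # -> 90.75
```

With these sample scores, the heavy homework weight (35%) pulls the final grade up noticeably, which is worth keeping in mind when budgeting time across the components.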


Office Hours and Contact Notes: Please check your email regularly. Class updates will be distributed by email.

            Also, I try to check my email regularly; please feel free to contact me with questions via email.

            However, I can only accommodate drop-in students (i.e., those without an appointment) during my office hours.


Academic Honesty: The University policy on academic honesty is quite strict. This policy is discussed in the Code of Student Conduct.

            The instructor’s academic honesty policy is very strict; any instance of academic dishonesty will be penalized, ordinarily by failure of the course (in addition to any University penalties).

All work submitted must be the student’s own work!


Additional Class Information:

·        No hand-written homework or programming projects are accepted.

·        You MUST bring a floppy disk (3.5”) or flash memory drive to save your work.

·        Make sure your name is on every item turned in.

·        Send email with the title starting “CS102: ”.


Programming Assignments:

·        Turn in both soft-copy and hard-copy (i.e., soft-copy = electronic copy through email, hard copy = print outs).

·        All assignments are due at the beginning of the class.

·        No late assignments are accepted without the instructor’s prior permission.



Class Schedule: The intended schedule of the class is indicated below. This list is subject to change.

June 15th, 2008

The Grid: The Next-Gen Internet?

Douglas Heingartner, 03.08.01

AMSTERDAM, Netherlands -- The Matrix may be the future of virtual reality, but researchers say the Grid is the future of collaborative problem-solving.

More than 400 scientists gathered at the Global Grid Forum this week to discuss what may be the Internet's next evolutionary step.

Though distributed computing evokes associations with populist initiatives like SETI@home, where individuals donate their spare computing power to worthy projects, the Grid will link PCs to each other and the scientific community like never before.

The Grid will not only enable sharing of documents and MP3 files, but also connect PCs with sensors, telescopes and tidal-wave simulators.

IBM's Brian Carpenter suggested "computing will become a utility just like any other utility."

Carpenter said, "The Grid will open up ... storage and transaction power in the same way that the Web opened up content." And just as the Internet connects various public and private networks, Cisco Systems' Bob Aiken said, "you're going to have multiple grids, multiple sets of middleware that people are going to choose from to satisfy their applications."

As conference moderator Walter Hoogland suggested, "The World Wide Web gave us a taste, but the Grid gives a vision of an ICT (Information and Communication Technology)-enabled world."

Though the task of standardizing everything from system templates to the definitions of various resources is a mammoth one, the GGF can look to the early days of the Web for guidance. The Grid that organizers are building is a new kind of Internet, only this time with the creators having a better knowledge of where the bottlenecks and teething problems will be.

The general consensus at the event was that although technical issues abound, the thorniest issues will involve social and political dimensions, for example how to facilitate sharing between strangers where there is no history of trust.

Amsterdam seemed a logical choice for the first Global Grid Forum because not only is it the world's most densely cabled city, it was also home to the Internet Engineering Task Force's first international gathering in 1993. The IETF has served as a model for many of the GGF's activities: protocols, policy issues, and exchanging experiences.

The Grid Forum, a U.S.-based organization, combined with eGrid (the European Grid Forum) and Asian counterparts to create the Global Grid Forum (GGF) in November 2000.

The Global Grid Forum organizers said grid communities in the United States and Europe will now run in synch.

The Grid evolved from the early desire to connect supercomputers into "metacomputers" that could be remotely controlled. The word "grid" was borrowed from the electricity grid, to imply that any compatible device could be plugged in anywhere on the Grid and be guaranteed a certain level of resources, regardless of where those resources might come from.

Scientific communities at the conference discussed what the compatibility standards should be, and how extensive the protocols need to be.

As the number of connected devices runs from the thousands into the millions, the policy issues become exponentially more complex. So far, only draft consensus has been reached on most topics, but participants say these are the early days.

As with the Web, the initial impetus for a grid came from the scientific community, specifically high-energy physics, which needed extra resources to manage and analyze the huge amounts of data being collected.

The most nettlesome issues for industry are security and accounting. But unlike the Web, which had security measures tacked on as an afterthought, the Grid is being designed from the ground up as a secure system.

Conference participants debated what types of services (known in distributed computing circles as resource units) provided through the Grid will be charged for. And how will the administrative authority be centralized?

Corporations have been slow to cotton to this new technology's potential, but the suits are in evidence at this year's Grid event. As GGF chairman Charlie Catlett noted, "This is the first time I've seen this many ties at a Grid forum."

In addition to IBM, firms such as Boeing, Philips and Unilever are already taking baby steps toward the Grid.

Though commercial needs tend to be more transaction-focused than those of scientific pursuits, most of the technical requirements are common. Furthermore, both science and industry participants say they require a level of reliability that's not offered by current peer-to-peer initiatives: Downloading from Napster, for example, can take seconds or minutes, or might not work at all.

Garnering commercial interest is critical to the Grid's future. Cisco's Aiken explained that "if grids are really going to take off and become the major impetus for the next level of evolution in the Internet, we have to have something that allows (them) to easily transfer to industry."

Other potential Grid applications include a virtual observatory and medical simulations of blood flow. While some of these applications have existed for years, the Grid will make them routine rather than exceptional.

The California Institute of Technology's Paul Messina said that by sharing computing resources, "you get more science from the same investment."

Ian Foster of the University of Chicago said that Internet precursor Arpanet was initially intended to be a distributed computing network that would share CPU-intensive tasks but instead wound up giving birth to e-mail and FTP.

The Grid may give birth to a global file-swapping network or a members-only citadel for moneyed institutions. But just as no one ten years ago would have conceived of Napster -- not to mention AmIHotOrNot.com -- the future of the Grid is unknown.

An associated DataGrid conference continues until Friday, focusing on a project in which resources from Pan-European research institutions will analyze data generated by a new particle collider being built at Swiss particle-physics lab CERN.

May 23rd, 2008


A step-by-step guide to the legal process.

Court proceedings generally begin with the filing of a complaint and the issuance of a summons. The complaint sets forth the grounds for the lawsuit, called the "cause of action." It states the injury or damage you've suffered, the names of the persons you believe are responsible, and the type of remedy you are asking the court to impose. It also makes a statement regarding why this particular court has jurisdiction, the authority to hear the case.

The summons is a legal notice issued by the clerk of the court telling the person or persons you've named as defendants that legal action has been commenced against them. It directs the defendant to file an answer with the court by a date specified. A summons must be formally served, or delivered to the defendant. In most cases, this "service of process" is done in person, perhaps by the sheriff or another law enforcement officer. More often, the summons is served by a professional process server, or some other disinterested party. In some cases, service may be made by sending a copy of the summons and complaint through the mail.

The defendant has a specified period of time in which to respond to the summons and complaint with what's known as an "answer." The answer may be used to deny the plaintiff's charges entirely, or to assert an "affirmative defense" to the plaintiff's claim. An affirmative defense in a personal injury case, for example, might be that you were injured through your own negligence, not the alleged negligence of the defendant.

Another common answer to a complaint is one which contains a motion asking the court to dismiss the charges for failing to state a cause of action. Suppose the complaint states that the plaintiff purchased a ladder from your hardware store, and that the ladder subsequently broke, causing the plaintiff to be injured.

A claim like this would probably be dismissed for failing to state a cause of action, since the plaintiff hasn't alleged that you did anything wrong that would make you responsible for the injuries. However, most courts will allow a plaintiff to amend his complaint to state a cause of action, so any sense of relief you may get from winning a motion to dismiss under these circumstances may only be temporary.

Along with the answer, the defendant may also file a counterclaim. A counterclaim may state that, rather than the defendant being liable for damages, in fact the plaintiff took some action which resulted in damages to the defendant. Suppose the original complaint charged the defendant with negligence in operating his motorcycle, which resulted in an accident with the plaintiff's automobile. A counterclaim might state that the plaintiff was actually negligent in the way he drove his car, and that this negligence was in fact the cause of the accident and the losses suffered by the defendant.

A person who receives a summons in a civil lawsuit may choose whether or not to respond to the court. However, failing to respond will most likely result in a default judgment being entered against the defendant.

Once the defendant's answer and any counterclaim is received by the court, a trial date will be set and what's known as "discovery" will begin. Discovery procedures are used to obtain evidence that will strengthen each party's case, and also to prevent either side from being surprised by undisclosed facts or unknown witnesses. (Unlike the way trials are often represented in movies and television programs, "surprise" witnesses don't often appear in real life trials.)

Discovery techniques include depositions, the oral questioning of the parties to the lawsuit as well as witnesses, and interrogatories, which are written questions that must be answered in writing. Depositions and interrogatories are both given under oath, and you could be charged with and convicted of perjury if you give answers that are untruthful.

While depositions and interrogatories are the best known forms of discovery, there are others as well. A "request for admissions" takes place when one side asks the other to admit to some important fact, or to attest to the authenticity of some document to be used as evidence. For example, the plaintiff's attorney may make a request for admission asking the defendant to agree to the fact that a specific document is a contract signed by both parties. If this fact is true, the defendant will admit to it. If it's not, or if there's some doubt on the defendant's part about the document's authenticity, he can deny the admission, or state that he has insufficient facts to support an admission.

A "request for production and inspection" is a form of discovery often used in business disputes. When a request for production and inspection is delivered, the party receiving it is asked to produce any and all books and documents in its possession that are pertinent to the lawsuit, or physical evidence that the party making the request cannot obtain through other means. If the party receiving the request refuses to do so, it must provide its reasons for denying the request. The party making the request can then ask the court to compel the production and inspection of the evidence. However, any request for business documents and other evidence must be fairly specific in stating what exactly is being sought, since otherwise the party making the request could simply go fishing through all of a company's files in search of evidence supporting its case.

Another form of discovery, one which is often used in personal injury cases, is the physical examination of the plaintiff. In cases brought to determine whether or not a person is competent, or to decide the fitness of a parent to have custody, mental and psychological examinations of the parties may also be sought.

Either side in the case may choose to file certain motions with the court. These motions are requests that are made to the court regarding some issue in the case, and asking the court to make a decision. Among the most common types of motions are those that ask the court to allow a plaintiff to amend a complaint, which ask the court to order the opposing party to comply with discovery requests, and which ask the court to dismiss the charges against a particular defendant.

Pretrial conferences may be called in order to allow both parties to discuss the issues in the case. Pretrial conferences are intended to minimize delays in trial proceedings, and in many cases these conferences will lead to an out of court settlement so that a trial will not need to take place at all. However, if a settlement can't be reached before the trial date set by the court, the next step in the litigation process is the trial itself.

Once the case is called to trial, a jury will usually be selected to hear the case, unless the parties have agreed to have the case tried by the judge. We'll say more about juries a little later on.

Each side then gets to make its opening statement. These statements are summaries of what each party will try to establish during the course of the trial. In some cases, the attorney for the defendant may decide to wait to make his opening statement until later in the proceedings, after the plaintiff has finished presenting his case.

Because the plaintiff has the burden of proof, he gets to go first in presenting his case. That means calling witnesses and presenting evidence in support of the claim made against the defendant. After the plaintiff's attorney finishes questioning a witness (called "direct examination"), the lawyer for the defendant gets the chance to cross-examine the witness, to point out contradictions in the witness' testimony, to show that the witness is unreliable, or to show that the witness has an interest in having the case decided in favor of the plaintiff.

After all of the plaintiff's witnesses have been called and all the evidence in support of the plaintiff's case has been presented, the plaintiff "rests his case." At this point, the lawyer for the defendant will ask the court to dismiss the case for lack of proof. If the plaintiff hasn't been able to set out enough evidence to support his claim, a motion to dismiss may be granted. More likely, however, the motion will be denied, and the defendant then gets to present his case. If he's reserved the right to make his opening statement to the jury, this is the time when he'll do so. Otherwise, the defendant begins by calling witnesses and presenting evidence designed to refute the plaintiff's claims.

Just as the defense gets to cross-examine the plaintiff's witnesses, the plaintiff can cross-examine the witnesses testifying on the defendant's behalf. After all of the defense witnesses have been called and the defense rests its case, the plaintiff gets the opportunity to present what's known as "rebuttal evidence." This rebuttal evidence is additional testimony from witnesses or other evidence that explains away some of the defense's case, or which contradicts it outright.

Each side then gets to make a closing statement, which summarizes its arguments and case and asks the court or the jury to provide a favorable judgment. Just as the plaintiff gets the chance to present rebuttal evidence after the defense presents its case, the plaintiff also gets the chance to speak after the defense makes its closing statement, in a final attempt to convince the court to find in the plaintiff's favor.

If a jury trial has been conducted, the jury will then be given instructions by the judge. These instructions include the law that governs the case, the way the jury must apply the law to the facts, and the burden of proof that must be met in order for the plaintiff to win. In most civil cases, the plaintiff must prove its case by a standard known as "a preponderance of the evidence." Basically, this means that the jury must believe that it's more likely than not that the defendant is liable for the damages the plaintiff claims.

The jury is then sent off to a room in the courthouse where it will deliberate until it reaches its decision, or until it becomes clear that the jury is deadlocked and cannot reach one. Deadlocked, or "hung," juries don't occur as often in civil cases as they do in criminal trials. Unlike criminal cases, which almost always require the jury to reach a unanimous decision, civil cases can often be decided by a simple majority of the jurors, or in some cases when two-thirds of them reach agreement.

Once the jury reaches its decision, it returns to the courtroom, where the verdict is announced. At this point, the lawyer for the losing side will almost always ask for what's known as "judgment notwithstanding the verdict." This motion asks the court to disregard the jury verdict and find in favor of the losing side instead. Courts will not grant this motion unless the verdict is clearly outrageous in light of the evidence presented during the trial. In most cases, a final judgment reflecting the jury's decision is entered by the court. At this point, the losing side in the trial must decide whether or not to appeal the trial court ruling.

Generally, an appeal can only be filed when the losing side can argue that the court erred in some courtroom procedure or in its interpretation of the law governing the case. The party filing the appeal, called the "appellant," usually can't re-argue the facts of the case to the appeals court. However, in some cases an appeals court can "remand," or return, the case to the trial court for further consideration of the facts in light of the appeals court's instructions on how they should be interpreted under the law.

While the steps above provide a general outline of the procedures followed in most civil courts, remember that state court rules and procedures do vary somewhat from place to place. If you are involved in a lawsuit, your attorney can give you more information about the exact procedures that will be followed in the court hearing your case.

You may also be surprised to learn that most trials contain little of the drama associated with the courtroom dramas portrayed in films and plays. In many cases, the lawyers will conduct a lot of business up at the judge's bench, trying to settle procedural issues out of earshot of the jury. And the judge may order the jury out of the courtroom during certain parts of the trial as he attempts to determine whether or not evidence can be admitted for the jury's consideration.

The lawyers won't often have the certainty of a Perry Mason, but then again they don't have the luxury of a script to follow and a director who can yell "Cut" and reshoot the scene when a line is flubbed. Nor will they be likely to wander around the courtroom or approach the witnesses to look them in the eyes and elicit some surprising admission. In most courtrooms, lawyers are required to remain standing at a podium several feet from where the witness sits, and may only approach the witness with the permission of the judge. In general, a lawyer who conducted himself the way most television and film lawyers do would find himself faced with contempt of court charges on a regular basis.

And even after the trial has been completed and while the jury is deliberating, the case may be settled, so the tension and excitement associated with the jury's return to the courtroom may never even be experienced. All in all, real life trials contain little of the electricity most of us are familiar with from their fictional counterparts.

WASHINGTON - The Democratic-led U.S. House of Representatives defied President George W. Bush on Friday and passed an anti-terrorism spy bill that permits lawsuits against phone companies. But the 213-197 vote was far short of the two-thirds majority needed to override a promised veto by Bush. He has demanded that any telecommunication company that participated in his warrantless domestic spying program, secretly begun after the Sept. 11 attacks, receive retroactive immunity.

The battle over whether to shield companies has been a key reason why the House and Senate have been unable to agree on a bill to replace a law that expired last month that expanded U.S. authority to track enemy targets without a court order.

It has also prompted Republicans to accuse Democrats of undermining national security, while Democrats have accused Bush and his fellow Republicans of election-year fear mongering.

“It is time to reject the scare tactics of the Bush administration and enact this carefully crafted legislation,” said Rep. Jerrold Nadler, a New York Democrat.

White House spokesman Tony Fratto fired back: “Their bill would make it easier for class-action trial lawyers to sue companies whose only offense is that they are alleged to have assisted in efforts to protect the country after the attacks of Sept. 11.”

About 40 lawsuits have accused AT&T Inc., Verizon Communications Inc. and Sprint Nextel Corp. of violating the privacy rights of law-abiding Americans swept up in the electronic surveillance of phone calls and e-mails. Damages could total in the billions of dollars.

Closed-Door Court

While the House-passed bill would not grant immunity, it would allow phone companies to present their cases in a closed-door court, with the judge given access to confidential documents about the surveillance and the authorization for it.

The bill would revamp the 1978 Foreign Intelligence Surveillance Act to keep up with ever-changing technology, like e-mails, which didn’t exist when the law was written.

The measure would also expand U.S. spy power, but not as much as the administration has demanded. In addition, it would increase congressional and judicial oversight.

Bush has backed a competing bill overwhelmingly approved by the Senate last month that would bolster U.S. electronic surveillance and grant phone companies retroactive immunity.

The House bill was approved shortly before lawmakers left for a two-week recess, leaving behind questions about if and when the House and Senate can agree on a measure to send to Bush to sign into law.

House Republican Whip Roy Blunt said, “The security of the country over the next two weeks while we’re gone will not be what it would have been if we would have passed the (Senate) bill today in a bipartisan majority.”

House Democratic leader Steny Hoyer accused the administration of “trying to stampede this Congress into passing the Senate bill. This Congress owes the American people more than blind obeisance to the executive branch.”

Shortly after the Sept. 11 attacks, Bush authorized warrantless surveillance. Critics charged he broke the law, while Bush says he had the war-time power to do it. He later put the program under FISA court supervision. Terms remain secret.

(Additional reporting by Richard Cowan and David Alexander)

The U.S. House of Representatives has voted to permit lawsuits that allege the illicit cooperation of telephone and Internet companies with government spy programs.

By a 227-189 vote largely along party lines on Thursday night, politicians approved the Democrat-backed Restore Act. The action, however, promptly renewed veto vows from the White House, which said the proposal "would dangerously weaken our ability to protect the nation from foreign threats."

Congressional Democrats who endorsed the bill disagreed. "Today's bill helps restore the balance between security and liberty," House Intelligence Committee Chairman Silvestre Reyes, a Texas Democrat, said in a statement after the vote.

The legislation is partially an outgrowth of still-unresolved allegations that U.S. telecommunications companies provided assistance to the National Security Agency's surveillance programs in violation of federal laws since--and possibly even before--the September 11, 2001, terror attacks. The Bush administration has requested that Congress approve legislation granting retroactive legal immunity to any telecommunications company that aided government spying.

Democratic leaders deny that their bill will make it harder to spy on foreign terrorists, but Republican leaders claim that the bill contains enough loopholes to require a warrant for eavesdropping on Osama bin Laden and other foreign terrorists.

"The bill gives terrorists overseas more rights under the law than individuals inside the U.S.," said Rep. Lamar Smith (R-Texas), a ranking member of the House Judiciary Committee. "That is simply absurd."

Supporters of the House bill say it allows intelligence agents to continue to snoop on foreigners without a warrant and to obtain "basket warrants" for surveilling foreign terrorist organizations.

At the same time, supporters say, the bill will provide additional safeguards for Americans' privacy and more oversight over the shadowy court that's charged with approving eavesdropping requests when one end of the communications belongs to a U.S. person.

The legislation is part of an update to the 1978 Foreign Intelligence Surveillance Act, or FISA, that the Bush administration argues is necessary to make intelligence gathering more efficient amid changing technologies.

Now focus will shift to the Senate, where a new battle over the immunity issue is likely to heat up soon.

The House vote arrived just hours after the Senate Judiciary Committee approved its own spy law rewrite but punted on the issue of whether to approve retroactive immunity for companies with access to electronic communications.

The Senate Intelligence Committee has already approved a different version of that legislation, containing a sweeping provision that would crush all pending lawsuits alleging illegal spying by companies like AT&T and Verizon Communications, as well as any future suits or state utility commission investigations.

The White House has already made it clear it vastly prefers the Senate Intelligence Committee version, but critics say that one gives the executive branch too much unchecked authority to eavesdrop, without a court order, on communications between Americans and people "reasonably believed to be outside the United States."

Both the Senate and House are attempting to craft a more permanent replacement to a Bush administration-backed temporary law called the Protect America Act, which hurriedly passed in Congress in August with what civil-liberties advocates and most Democrats said were insufficient privacy safeguards for Americans. Set to expire in early February, it currently immunizes companies that have cooperated with any government wiretapping regimes since the law was passed.

The existing law, however, does not grant immunity to companies that may have cooperated in the past. The Bush administration has been threatening to veto any bill that does not contain that retroactive protection.

Rep. John Conyers (D-Mich.), one of the Restore Act's authors, said the politicians "cannot even begin to consider this request" until they receive administration documents, which they say they requested 10 months ago, describing the telephone companies' activities in more depth.

May 20th, 2008


1. What is a primary goal of my OS?

* Is it a standard (low-end) desktop system? The user is a novice; the highest priority is hardware and software compatibility.

* Is it a high-end desktop system? The user is a CAD/CAM engineer; the highest priorities are performance and compatibility with specific hardware and software.

* Is it a real-time oriented system? The user is a professional programmer; the highest priorities are performance, defined response times, easily extendable hardware support and programming control.

2. What platforms is my OS going to support?

* Will it support multiprocessing?

* What kind of multiprocessor platforms? Symmetric? (all processors are exactly the same). Asymmetric? (CPUs may be different in architecture and computing power). Both?

* Will it support only local multiprocessing? (all CPUs are connected through a local bus). Distributed multiprocessing? (CPUs are connected through network-like connection). Both?

* What is the target hardware system? Desktop? (more or less standard hardware set). Customizable (embedded) hardware? (If the latter, you will likely have to support each processor individually, even compatible ones.)

3. Will it be a multitasking OS?

* What kind of multitasking will it provide for applications? Cooperative? (tasks yield CPU when they don't need it, demonstrating good will). Preemptive? (tasks are given a defined amount of CPU time).

* Do I need to protect tasks from each other well?

* What is the relationship between tasks in terms of living space? Do they share the same address space? Are they completely separated? Both?

* How will different tasks communicate with each other?

* What will the memory model of a task's address space be? Should I favor simplicity and speed (memory is cheap) or size (memory is a scarce resource)?

* Do I need to protect the system from application tasks?

4. What file system will my OS use?

* Should I favor access time (performance) or reduced storage space (size)?

* Can I use one of the already developed and well-documented file systems?

* Can I use a cut-down version of one of the well-known file systems?

* What will the executable format be?

5. What build tools do I need?

* Can I use one of the existing compilers and linkers?

* Can I obtain (for free, or by buying or licensing it) source code for compilers and linkers?

* Do I have to write some of the tools myself?

* Do I have to write all the tools myself? This should be avoided by all means.

6. How can I easily support third-party software?

* Can I support existing, popular software?

* How can I make it easy for third parties to create applications for my OS? (Libraries.)

* How can I make it easy for third parties to create device drivers?

7. How can I use already written code and information?

* Can I use code that others have written and that works? (Even partially.)

* Where can I get different kinds of information? (Build your own information library.)
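The cooperative-versus-preemptive choice in question 3 is easier to weigh with a toy model in hand. Below is a minimal, hypothetical sketch (not part of the original checklist) of a cooperative round-robin scheduler in Python, using generators as tasks: each task keeps the CPU until it voluntarily yields, which is simple to implement but means one misbehaving task can starve all the others.

```python
from collections import deque

def scheduler(tasks):
    """Round-robin cooperative scheduler: run each task until it yields."""
    queue = deque(tasks)
    while queue:
        task = queue.popleft()
        try:
            next(task)          # run the task until it yields the CPU
            queue.append(task)  # cooperative: re-queue it for another turn
        except StopIteration:
            pass                # task finished; drop it from the queue

def worker(name, steps, log):
    """A toy task that records its progress and yields after each step."""
    for i in range(steps):
        log.append(f"{name}:{i}")
        yield  # voluntarily give up the CPU

log = []
scheduler([worker("A", 2, log), worker("B", 3, log)])
print(log)  # tasks interleave: ['A:0', 'B:0', 'A:1', 'B:1', 'B:2']
```

A preemptive kernel, by contrast, would interrupt `worker` after a time slice whether or not it reached the `yield`, which is exactly the extra complexity (timer interrupts, context saving) the checklist asks you to budget for.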
Future OS, Page 3/3
4 The Network is the Computer

These days, networking has become an essential part of every computer system, be it a standalone PC, a file server or a mobile phone. Well, you are right, the dishwasher has no network connection... yet. So the new OS certainly needs to be network-enabled. That does not mean there is no room for improvement, however. We have the Static, DHCP and Zeroconf methods of getting IP addresses, NFS and SMB to share files, CUPS, LPR and SMB to share printers, NIS, NIS+ and LDAP to have the same user accounts everywhere, and Remote Desktop, VNC and X for remote logins.

These existing systems work. Sometimes. After editing a lot of settings and configuration files. And that is not how it should be. If networking is to be practical, it should be really practical, for everyone. After all, a home user wanting to take advantage of the network they have set up for Internet sharing does not want to dive into the world of TCP/IP, DHCP servers, gateways, DNS and so on. They want a network that just works. And what would be rather practical is if you were able to edit the same document no matter whether you are working on the desktop computer, the laptop or the refrigerator.

Such a thing cannot be accomplished easily. Rendezvous is a step in the right direction, but it is still bound to one single computer: you don't instantly have access to your documents - you need to search through other computers for the resource you need and login to that computer before you have access.

4.1 The basic idea

This is a rather interesting question. Imagine you have a home network with two computers. On the one hand, you want to be able to log in to both of them, even when the other one is down. On the other hand, you don't want a hacker to be able to enter the network with his laptop and gain access to everything. And in a larger network, you don't want each PC to store all user data; such a network probably has a server running 24/7 anyway.

That already implies that there would be two "modes" of operation: one for the home user, where each PC knows all accounts, and another one for centralized networks, where a server knows them. In a perfect world, these two could be matched, so let's look at how that can be done.

4.2 Peer-to-peer and server-client implementation

In principle, each PC operates in decentralized mode. Without a network, that means it has one user (with an associated ID) who owns everything. If two such computers meet, each will learn the other's user data. Now you can log in to both computers with exactly the same result.

In a larger network, a server can be added. In a peer-to-peer fashion similar to decentralized mode, the server information is shared (but only its address, not the accounts themselves). When someone wants to log in now, the local user database is checked first, and when there is no match, the computer will also ask the server. The latter will send the account information to the local PC, and if everything is right you will be logged in and have access to the network, most likely including the printers and drive space attached to the server. Additionally, the account now exists on your local PC too, so you can use it even when you aren't connected to the network.

4.3 Account modification

The only problem left is changing your password, as the new password needs to be propagated through the network without allowing hackers to change it. Luckily, there is a solution for this too, and it is rather easy. The new password will carry the old one "within itself," so that the change can authenticate itself. This way, no hacker can change your password without knowing the current one, while you can. To resolve the case where two password changes collide, the date of each password can be stored in the account. This also makes it possible to remove obsolete passwords after a certain amount of time.

5. The end result

Finally, it might be useful to look at the results of the proposal: is it innovative, and almost more important, is it useful and user-friendly?

I believe the proposed GUI does indeed break with the current tradition, and does so in a useful way. Doing away with the windowed interface may seem like a step backward, but it removes something that is rather confusing for new computer users (and has no advantage over split-screen windowing other than wasting space, because windows don't fit together). Not having too fancy an interface is also a good thing, as it doesn't distract you from your work and does not scare people away (yes, people fear Windows XP because it is different from 98/Me).

The document format, on the other hand, does not offer much more than Display PDF or something like it. Combined with the linking model, however, it becomes more powerful than what we have today, allowing the use of pipes, famous within the Unix world, within a graphical environment, which greatly extends the possibilities and reduces complexity.
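The pipe model being borrowed here is easy to illustrate outside any GUI. A small, hypothetical Python sketch, chaining processing stages the way a shell chains programs (the stage names are mine, purely for illustration):

```python
# Unix-style pipes as composable generator stages; in the proposed GUI these
# stages would be linked documents/components rather than shell commands.
def source(lines):
    for line in lines:
        yield line

def grep(pattern, stream):
    """Pass through only the lines containing the pattern."""
    for line in stream:
        if pattern in line:
            yield line

def upper(stream):
    """Uppercase every line that flows through."""
    for line in stream:
        yield line.upper()

# Rough equivalent of the shell pipeline:  cat doc | grep todo | tr a-z A-Z
doc = ["todo: fix bug", "done: write docs", "todo: test"]
pipeline = upper(grep("todo", source(doc)))
print(list(pipeline))  # ['TODO: FIX BUG', 'TODO: TEST']
```

Each stage knows nothing about the others, which is the complexity reduction the essay is pointing at: the linking model supplies the plumbing, and the components stay small.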

The network model, finally, unifies the traditional server-based systems like UNIX and Netware and the peer-to-peer networks like AppleShare and SMB in one package, allowing one consistent interface for both types of networks, still powerful but also comprehensible for the average home user.

Though this proposal might never see a working implementation, I still believe it shows there is a lot of room for innovation in the current operating systems. So I hope that they will not only innovate behind the scenes (SMP support, NTPL, WinFS, ...) but that one of them will take the step to break with the past to allow new concepts in, so that the end user will finally get improvements as well.
posted by Daan Goedkoop on Wed 7th Apr 2004 08:29 UTC
How will future operating systems look? How will the user interface, the inner workings, the security policies and the networking interact? In any case, innovation is the key.

If you visit OSNews once in a while, you will of course know everything about the present and the future of operating systems. Somewhere between 2005 and 2007, Microsoft will release Windows codename Longhorn, and until that happens Gnome and KDE need to fill the gap between themselves and Windows XP. And if everything goes well, they will implement some Longhorn features as well. On the other side, we have the innovative Mac OS X. It is the most user-friendly computer system on earth, is built on UNIX, and has OpenGL acceleration of the screen.

Wait. Read that again, and think for yourself: how much innovation has there been, and how much will there be? Let's start with Gnome and KDE. They are mainly copying the user interface of Windows. Yes, Gnome places the application menu at the top of the screen instead of the bottom, and KDE has invented KIO. But almost everything else is plain copying. KDE even has the window buttons in exactly the same place as Windows. There is a reason for this, a quite simple one, actually. Most people today work with Windows, and the projects are afraid that a desktop environment that behaves radically differently would scare people into sticking with Windows.

But how is Windows doing? Is Windows innovative? This page says Windows is innovating, and says Windows is to MacOS what Java is to C++. That's not entirely true: C++ was a bad fix to C, and Java cleaned everything up. MacOS, on the other hand, was a clean, new implementation of a graphical OS, while Windows was just a way to fix DOS. From that, we can say that Windows is to MacOS as if C++ had been invented as a reaction to Java. And when we look a bit closer, what has Microsoft actually invented? They copied the overlapping windows. Explorer is a copy of the Finder, while SMB is a copy of AppleTalk. Word was a reaction to WordPerfect, and Internet Explorer is just an improved version of NCSA Mosaic. And there is a reason Windows does not really innovate: Microsoft doesn't want to lose its market share, so it takes care not to scare users. If the Windows interface were to change radically, users could switch to Linux just as easily as upgrade to the new Windows version.

You might have noticed that Windows borrowed quite a few things from Apple. So, is Apple innovating? In 1984, they were. The Macintosh was a nice new computer: one of the first (if not the first) home computers that was no longer character based and that had the mouse as a mandatory input device. Shortly thereafter, they invented AppleTalk, with which networking computers became as easy as plugging in the network cable. After that, only minor system updates came out until Mac OS X was released. It was called innovative. But what does it do? It is effectively a MacOS-like GUI on a UNIX core, so in fact it does nothing more than combine two technologies, both decades old. That has a reason, too: Apple's market share is small, and in this way they can keep their former customers while also attracting new ones: their OS is now built on the "proven reliability" of UNIX, thanks to it being 30 years old. Apparently, they have not read the Unix-Haters Handbook, from which it seems UNIX was rather unstable even 10 years ago.

Does that mean the current operating systems are the best, and that better is simply impossible? Most likely not; the most logical reason for the lack of innovation is the fear of losing market share by inventing something better, er, different. So here is my proposal: if you build an entirely new operating system, why not make it different from the ones that exist, so that it can try out ideas that might be better than the current ones? It might even attract users, namely those who want a different operating system for a change, one with an identity. In the rest of this article, I will lay out such a proposal. I'll have to see whether I have time to work on an actual implementation, but thanks to its nature it luckily isn't necessary to start with the bootloader :-)

1 Virtual machine

Nowadays, new processors are being invented: the Itanium and the AMD64. To take advantage of these processors, the operating system and all applications that run on it need at least to be recompiled, and parts of them need to be rewritten. That is not very practical, something Sun realized when it invented Java. Microsoft has also seen this and started the .NET project. Both implement a virtual machine that runs binaries specially adapted to it. The advantage is that the same binaries can always run on the virtual machine, no matter what the host OS or the hardware is.

As this is very practical, I will take such a virtual machine (VM for short) as the basis of the OS idea. Not very innovative, I know, but rather practical. It makes the OS and its applications completely hardware-independent, and it also has the advantage that the VM can first be implemented on top of another OS, so that work can immediately start on the VM and the OS itself, without needing to code a boot loader and extensive hardware support first.
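The core of such a VM can be illustrated with a toy stack-based interpreter in the spirit of the JVM and .NET CLR mentioned above. The instruction set here is invented purely for illustration; a real VM would of course define its own bytecode format.

```python
# A toy stack-based virtual machine: the same bytecode runs unchanged on
# any host that implements this interpreter, which is exactly the
# hardware independence the proposal is after.

def run(bytecode):
    stack = []
    for op, arg in bytecode:
        if op == "PUSH":          # push a constant onto the stack
            stack.append(arg)
        elif op == "ADD":         # pop two values, push their sum
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif op == "MUL":         # pop two values, push their product
            b, a = stack.pop(), stack.pop()
            stack.append(a * b)
        else:
            raise ValueError(f"unknown opcode: {op}")
    return stack.pop()

# (2 + 3) * 4, expressed as bytecode that is identical whether the host
# is an x86, an Itanium or an AMD64 machine:
program = [("PUSH", 2), ("PUSH", 3), ("ADD", None), ("PUSH", 4), ("MUL", None)]
print(run(program))  # 20
```

Because only the interpreter touches the hardware, porting the OS to a new processor means porting this one loop, not every application.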

2 The user interface

The user interface should be friendly and practical, both for the newbie and for the experienced computer user. Therefore, no POSIX compatibility is needed and no GNU utilities need to be ported. And why should they be? In this modern world, we want to use more than text. We want fonts, webpages, flash animations, music, pictures and movies. The command line is not suitable for them, so a graphical interface (GI) is really necessary.

2.1 The general layout

However, this does not mean copying the GUIs of Windows or MacOS, which can be rather confusing. For example, most GUIs have overlapping windows, which are confusing; the Xerox Star designers already knew this and therefore didn't allow windows to overlap. The confusion is the following: imagine you have two windows, say a maximized Outlook Express and a normal New Message window on top. When you accidentally click the Outlook Express window, it will look like the message you were typing is lost. Of course, it is just hidden behind the window you clicked, but that is not obvious. The solution is to take the idea of the original MacOS even further: not only hide other applications when you activate one, but make all windows maximized as well. That solves the overlapping-window problem and does away with the title bar taking up precious screen space.

Now you will probably notice that drag and drop is no longer possible, at least not between applications and also not between windows. That is not practical, because it is a much more visual way of moving objects than the copy-paste method Windows introduced. Therefore, the GI should offer a split-screen mode in which two windows can be visible next to each other.
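The split-screen mode amounts to a tiling layout: the screen is carved into non-overlapping panes. A minimal sketch of the geometry (the function name and coordinate convention are my own, not the proposal's):

```python
# Sketch: splitting the screen into non-overlapping panes, Xerox
# Star-style. Each pane is a rectangle (x, y, width, height).

def split_screen(width, height, panes):
    """Tile `panes` windows side by side; nothing ever overlaps."""
    pane_w = width // panes
    return [(i * pane_w, 0, pane_w, height) for i in range(panes)]

# The normal full-screen case vs. a two-pane split for drag and drop:
print(split_screen(1024, 768, 1))  # [(0, 0, 1024, 768)]
print(split_screen(1024, 768, 2))  # [(0, 0, 512, 768), (512, 0, 512, 768)]
```

Since pane positions are fully determined by the pane count, no window can ever hide another, which is precisely what removes the Outlook Express confusion described above.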

May 18th, 2008

On Medved...

Michael Medved is pathetic. He is nothing more than an RNC cheerleader slash puppet, cheering on the 'straight-talk express'....yeah, whatever. You will be praising John McCain all the way to his grave (or his wing of the hospital).


I posted a bunch of stuff via mobile but for some reason it didn't make it..........oh well

May 17th, 2008

Brutal Honesty

Only by being brutally honest with ourselves (not everyone else!) can we ever hope to succeed and live up to our potential. Everyone else is judging us harshly, so we should judge ourselves by the same measure, not with kid gloves.

The Conservative Party

an idea whose time has come....

Your Ugly Wife, Chapter 1

Hey, guess what, pal? Your wife is ugly. No one bothered to tell you, but, she screwed three of your friends before you married her. Yeah, she was a big slut in high school too. By the way, everybody laughs at you behind your back. Not to mention the fact that your kids are little brats with no friends.

The lies we tell ourselves are the truths that ultimately destroy us. Those who know the truth, about themselves, and the world around them, are truly free. Those who do not (which comprise the majority of the world population) live in a perpetual lie, perpetuated by their every pathetic breath.

"Your Ugly Wife"

title of a possible book I might work on, about the lies we as Americans tell ourselves to make it through crummy, pointless lives...

May 11th, 2008

How to get paid and get fit with my “wholesaling exercise plan”
December 10th, 2007

While I love sitting on the couch and watching TV with my wife and kids, I know one important rule–sitting on the couch watching TV has never gotten me any wholesale deal.

So if you want deals, it’s time for you to get up and get into some money-making action by going out and finding motivated sellers! By doing so, you’ll not only make great profits, you’ll also gain the benefit of helping America! True! All the experts say Americans are getting too fat through lack of exercise, so you can get fit by getting out and contacting the ideal sellers described below and making money at the same time. As Batman would say, “Holy sweet deal!”

Need proof that my “wholesaling exercise plan” works? Heck, have you ever seen a picture of me? I’m so skinny that if I turn sideways, nobody can see me (which is real handy when I’m in trouble with my wife!) :)

So, get up off your duff and get some great and profitable exercise by finding and contacting the following motivated sellers. Your wholesale fee is waiting for you!
Sellers Who Inherited a Property

These sellers are ideal for three reasons:

* Often they don’t live in the inherited property, so they don’t feel any emotional attachment to it.
* Such wholesale properties often require a lot of work in terms of repair and maintenance; the sellers may simply not have the money required to do the fix-up.
* The house may be owned free and clear (or with a small mortgage), so the sellers are willing to sell at a deeper discount because they don’t really have any money in the property. Any money received they consider extra money.

One disadvantage of this market—sellers from big families! It can be difficult to track all the siblings down (different cities and states) and get them to agree to a deal. You know families–brothers and sisters don’t always agree with each other on what to do with a property. If you run into such a situation, use this wholesale strategy–deal with each person on a one-on-one basis and tell them that you’ll work with their siblings separately. That way, you can keep emotional fireworks out of the equation and work with each person on a rational basis.
Out-of-State Sellers

These sellers can be one of several types:

* Inheritors.
* Re-locators (job changes, military service, etc.)
* A person who bought a house as rental property and then found out that he or she can’t manage it, etc.

Such properties become vacant or aren’t well-maintained and need a lot of work. Out-of-state sellers can be hard to track down, but there’s one big advantage to this situation–they may not know the local market well, so it’s easier to get them to agree to a discount.
Older Sellers

Often, older sellers move in with their children or into nursing homes. Their children may not want to occupy the house for various reasons—too many repairs required; they live out of state, etc. Older sellers are ideal because, in many cases, they just want to sell the house as-is and get their money out of it, so they’re more willing to give you a deep discount.
Tired Landlords

These individuals tend to fall into three categories and are ideal sellers in the wholesale market:

* Retiring landlords
* Burned-out landlords
* Amateur landlords

The second category has had to deal with bad tenants who’ve trashed the property and/or failed to pay rent. So, they’ve had to go through the tedious process of evicting tenants and then finding new ones. In the end, they get burned out by the complications and losing income and simply want to get rid of the property.

The amateur landlords have watched one too many infomercials about an easy way to make millions. They end up with headaches never mentioned in the infomercials — bad tenants, ignorance of state and federal laws, tax complications, etc. After a while, they simply want out and are willing to deal on a wholesale basis.
Sellers Who’ve Abandoned a Property

Sometimes, sellers simply abandon a property because of delinquent taxes or because it costs too much to repair and maintain. You’ll find these wholesale properties in lower-income neighborhoods.
Sellers With a Lot of Equity

These sellers have owned their homes for a long period of time—10 or more years. These homes may also require a lot of maintenance due to their age. Owners can’t sell these homes to retail buyers. This gives you the opportunity to negotiate on the equity and find a good wholesale deal.

For example, assume sellers have a house that’s worth $100,000 and owe only $50,000 on it. In this case, it’s much easier for you to buy the house at $60,000 than it would be if the house were valued at $100,000 with the seller owing $80,000 on it. The more equity the seller has, the more room there is to negotiate; when there’s little or no equity, there’s nothing to negotiate from.
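The equity arithmetic above can be sanity-checked with a tiny script. The numbers are the article's own illustrative figures, and the helper names (`seller_equity`, `seller_nets`) are made up for this sketch.

```python
# Quick check of the equity arithmetic in the example above.

def seller_equity(value, owed):
    """Equity = market value minus mortgage balance."""
    return value - owed

def seller_nets(offer, owed):
    """What the seller walks away with if they accept `offer`."""
    return offer - owed

# Seller A: worth $100,000, owes $50,000 -> $50,000 in equity.
print(seller_equity(100_000, 50_000))  # 50000
print(seller_nets(60_000, 50_000))     # 10000  (a $60,000 offer still nets them cash)

# Seller B: worth $100,000, owes $80,000 -> only $20,000 in equity.
print(seller_nets(60_000, 80_000))     # -20000 (a $60,000 offer leaves them short)
```

Seller A can accept a deep discount and still pocket money; Seller B cannot, which is why high-equity sellers make better wholesale targets.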

In summary, there are several reasons why the sellers described above are an ideal target market for you. They:

* Don’t want the property.
* Can’t market the property.
* Don’t have the time, energy, or money to fix up the property.
* Don’t know the true potential of the property.

So, what are you waiting for? Motivated sellers and wholesale fees are out there waiting for you! Get up off the couch and into money-making action!

To Your Success,

Tim Mai

P.S. Go here to get instant access to our list of motivated sellers www.hotbargainproperties.com

Posted in Uncategorized | No Comments »
Wholesaling Power Tools
December 4th, 2007

Do you remember the comedy TV series “Home Improvement,” starring the comedian, Tim Allen? In that show, Tim “The Tool Man” Taylor couldn’t resist adding “more power” to every appliance, tool and gadget he could find. Of course, most times, he ended up blowing up or destroying everything he worked on!

Well, in this article, I’m going to provide you with a list of wholesaling tools that will give you lots of wholesaling power but definitely won’t blow up on you! They’ll make the job of finding wholesale properties easier. Even better, they’re inexpensive and simple to use.

Here’s a checklist of “must” tools:

* A digital camera—Wholesaling is all about finding vacant and ugly houses on drives through neighborhoods and then taking photos of them so you have a record of their appearance. You don’t need any fancy camera to get this job done. Look for a camera below $100 cost.

* Small, dry-erasable board—I like efficiency so I use a board when looking at wholesale properties. I simply write the address on the board, raise it up and take a picture of the house so I have the address and property on one photo. Then, I erase the board and move on to the next wholesale property. Of course, that’s my way! Use a system that works best for you.

* Flashlight—Did you know many wholesale properties are vacant, boarded up, or have the power turned off? Unless you’re a cat, you can’t see well in the dark so always, always have a flashlight with you so you can check out the true condition of the wholesale property.

* Phone/fax line—This hook-up is extremely important to your wholesaling business. Why? Because you’ll be faxing contracts to out-of-town title companies and sellers, and this is the fastest way to do it. In the wholesaling business, there’s definitely a “need for speed!”

* Business cards—Always a must since people rely on them to get in touch with you! However, remember this—your cards should be interesting enough to catch people’s interest and make them remember you as a wholesaler. In another blog, I’ll tell you about the card design I use that makes it stand out from all other cards.

* Computer and Internet access—The Digital Age has revolutionized many things, including the wholesaling of properties. An Internet connection will give you access to my membership website and many other resources. Be sure to get a DSL or cable connection. No dial-up connections! Talk about sloooooooooooooooooooooooow!

* Skip-trace account—As a wholesale investor, you’ll need to locate the owners of vacant or ugly properties who may live in another town or another state. This is no longer a “pain in the posterior” because of skip-tracing Internet services like Accurint http://www.accurint.com/ (my choice) or MerlinData http://www.merlindata.com/).

* Microsoft Word and Excel—You don’t have to use Word, but it’s the business word processing standard and it allows you to create marketing letters easily and cheaply. Word’s “mail merge” function, which can pull addresses from an Excel spreadsheet, makes it easy to execute a wholesaling direct mail campaign.

* Multi-function printer/copy/fax machine—There’s one thing you’ll be doing all the time as a wholesale investor–printing, copying and sending contracts! So, get a good, sturdy machine, but one that’s easy on expensive ink.

* Filing cabinets, file folders, tickler system—As a wholesale investor, you must stay organized and have information at your fingertips. I recommend the “tickler” system. This is an organizational method in which 12 folders represent months and 31 folders represent each day. The system “tickles” your memory at the start of each day, so you know exactly what you need to do and the people you need to contact in terms of wholesale deals.

* Wholesale Deal Websites - HotBargainProperties.com & 3DayBid.com - You gotta use these websites to find deals as well as wholesale your deals on there.

So, what are you waiting for? Get your wholesaling tools and get to work making money!

If you’re not yet a member of HotBargainProperties.com, hurry here now for your 30-day free trial www.hotbargainproperties.com

To Your Success,

Tim Mai

May 8th, 2008

Voice Post

483K 2:30
(no transcription available)

Voice Post

699K 3:33
(no transcription available)

Voice Post

582K 2:59
“Alright anyway my last little thing was little bit fuzzy. So I just wanna reiterate what I've been saying as the last I said like why you gotta listen to a guy like Vic Moris you know. I'm not asking for money I'm not trying to push your book you know. I don't have big bias towards Hillary which he does have so I don't think he's really qualified to tell you oh this is gonna happen that's gonna happen. Dude I hate Hillary. What I mean I don't hate Hillary or the bomb bomb Mary but Amber probably looking at at a conservative but I still like I just gonna feel like yo I prefer I'll prefer the democrats to this one because I feel like what is the point of you know why should I vote for Mccain and why should any republican support Mccain when you know I mean I don't know. That's just what I think I mean probably the man of opinion. But he's not a republican much less a conservative. You know his a rhino or whatever and I'm just not gonna support him so I'd rather see the democrats win and let the republicans kind of keep their shit together. You know and also this is one more time if you democrats back Obama you are not going to win in November. I just wanna make that clear and I wanna put that off the record so that nobody can say that I never said that. So the only people the only people to blame for your loss in November and you will lose to Mccain if you put up Obama against him as your sub. So if you wanna do that he should go into this knowing for well again no chance in November. They're gonna bring up gentleman right. They're gonna bring you know the Pennsylvania I mean the stuff is stock to him you cannot unspeck(?) it. Then none of this is fully over. Nobody has forgotten about this I'm telling you. You think white voters like me cross they're cross over all kinds of people that you had are gonna come out for him in November? Hell no no I'm not. I'm not gonna back him. It's not gonna happen. I just don't wanna you know. Still don't get it. 
So don't cry and mourn and say oh we didn't know the guys are gonna win. You won't fall if he lose but if you put up Hillary Clinton you got a good chance of winning. I don't nope I tell you got a 60 40 chance of beating Mccain. So it's up to you but you know. You know this is the time that you listen to me not Vic Moris. Vic Moris don't know what the hell he's talking about. So make up your minds and it's the last time I'm gonna warn you about Obama. No chance of winning. ___.”

Auto-Transcribed Voice Post - spoken through SpinVox
Nature of the Industry

Goods and services. All organizations today rely on computer and information technology to conduct business and operate more efficiently. Computer software is needed to run and protect computer systems and networks. Software publishing establishments are involved in all aspects of producing and distributing computer software, such as designing, providing documentation, assisting in installation, and providing support services to customers. The term “publishing” often implies the production and distribution of information in printed form. The software publishing industry also produces and distributes information, but usually it does so by other methods, such as CD-ROMs, the sale of new computers already preloaded with software, or through distribution over the Internet. Establishments in this industry may design, develop, and publish software, or publish only. Establishments that provide access to software for clients from a central host site, design custom software to meet the needs of specific users, or are involved in the mass duplication of software are classified elsewhere. (For more information, see the section on computer systems design and related services found elsewhere in the Career Guide.)

Industry organization. Software is often divided into two main categories—applications software and systems software. Applications software includes individual programs for computer users—such as word processing and spreadsheet packages, games and graphics packages, data storage programs, and Web browsing programs. Systems software, on the other hand, includes operating systems and all of the related programs that enable computers to function. Establishments that design and publish prepackaged software may specialize in one of these areas, or may be involved in both. Some establishments also may install software on a customer’s system and provide user support. In 2006, there were approximately 10,000 establishments that were engaged primarily in computer software publishing, or in publishing and reproduction.

Recent developments. The Internet has vastly altered the complexion of the software industry over the last decade. Much of the applications and system software that is now developed is intended for use on the Internet, and for connections to the Internet.

Organizations are constantly seeking to implement technologies that will improve efficiency. Enterprise resource planning (ERP) software is such an example. ERP, which is typically implemented by large organizations with vast computer networks, consists of cross-industry applications that automate a firm’s business processes. Common ERP applications include human resources, manufacturing, and financial management software. Recently developed ERP applications also manage a firm’s customer relations and supply-chain.

Electronic business (e-business) is any process that a business organization conducts over a computer network. Electronic commerce (e-commerce) is the part of e-business that involves the buying and selling of goods and services. With the growth of the Internet and the expansion of e-commerce, there is significant demand for e-commerce software that enables businesses to become as efficient as possible.

This widespread use of the Internet and intranets also has led to greater focus on the need for computer security. Security threats range from damaging computer viruses to online credit card fraud. The robust growth of e-commerce increases this concern, as firms use the Internet to exchange sensitive information with an increasing number of clients. As a result, organizations and individual computer users are demanding software, such as firewalls and antivirus software, that secures their computer networks or individual computer environments.

Working Conditions

Hours. In 2006, workers in the software publishing industry averaged 37.6 hours per week, compared with 33.9 for all industries combined. Many workers in this industry worked more than the standard 40-hour workweek—about 26 percent worked 50 or more. For some professionals, evening or weekend work may be necessary to meet deadlines or solve problems. Professionals working for large establishments may have less freedom in planning their schedule than do consultants for very small firms, whose work may be more varied. Only about 3 percent of the workers in the software publishing industry worked part time, compared with 15 percent of workers throughout all industries.

Work environment. Most workers in this industry work in clean, quiet offices. Given the technology available today, however, more work can be done from remote locations using fax machines, e-mail, and especially the Internet. Employees who work at video terminals for extended periods may experience musculoskeletal strain, eye problems, stress, or repetitive motion illnesses, such as carpal tunnel syndrome.

Employment

In 2006, there were about 243,000 wage and salary jobs in the software publishing industry. While the industry has both large and small firms, the average establishment in software publishing is relatively small; more than half of the establishments employed fewer than 5 workers. Many of these small establishments are startup firms that hope to capitalize on a market niche. About 76 percent of jobs, however, are found in establishments that employ 50 or more workers (chart 1).

Sixty percent of the establishments in software publishing employ fewer than 5 workers, but a few large establishments employ almost half of all workers.

Relative to the rest of the economy, there are significantly fewer workers 45 years of age and older in software publishing establishments. This industry’s workforce remains younger than most, with large proportions of workers in the 25-to-44 age range (table 1). This reflects the industry’s explosive growth in employment in the 1980s and 1990s, which afforded opportunities to thousands of young workers who possessed the latest technical skills.

Table 1. Percent distribution of employment, by age group, 2006

Age group       Software publishers   All industries
Total                 100.0%              100.0%
16-19                   0.7                 4.3
20-24                   4.4                 9.6
25-34                  28.7                21.5
35-44                  36.8                23.9
45-54                  19.1                23.6
55-64                   8.1                13.4
65 and older            2.2                 3.7

Occupations in the Industry

Providing a wide array of information services to clients requires a diverse and well-educated workforce. The majority of workers in the software publishing industry are professional and related workers, such as computer software engineers and computer programmers (table 2). This major occupational group accounts for about 61 percent of the jobs in the industry, reflecting the emphasis on high-level technical skills and creativity. By 2016, the employment share of professional and related occupations is expected to be even greater, while the employment share of office and administrative support jobs, currently accounting for about 11 percent of industry employment, is projected to fall.

Professional and related occupations. Computer specialists make up the vast majority of professional and related occupations among software publishers, and account for about 52 percent of the industry as a whole. Their duties vary substantially, and include such tasks as developing software applications, designing information networks, and assisting computer users.

Programmers write, test, and maintain the detailed instructions, called programs or software, that computers must follow to perform their functions. These programs tell the computer what to do—which information to identify and access, how to process it, and what equipment to use. Programmers write these commands by breaking down each operation into a logical sequence of steps, and converting the instructions for those steps into a language that the computer understands. While some still work with traditional programming languages like COBOL, most programmers today work with more sophisticated tools. Object-oriented programming languages, such as C++ and Java, computer-aided software engineering (CASE) tools, and artificial intelligence tools are now widely used to create and maintain programs. These languages and tools allow portions of code to be reused in programs that require similar routines. Many programmers also customize purchased software or create better software to meet a client’s specific needs.

Computer software engineers design, develop, test, and evaluate software programs and systems. Although programmers write and support programs in new languages, much of the design and development now is the responsibility of software engineers or software developers. Software engineers must possess strong programming skills, but are more concerned with developing algorithms and analyzing and solving programming problems than with actually writing code. These professionals develop many types of software, including operating systems software, network distribution software, and a variety of applications software. Computer systems software engineers coordinate the construction and maintenance of a company's computer systems, and plan their future growth. They develop software systems for control and automation in manufacturing, business, and other areas. They research, design, and test operating system software, compilers—software that converts programs for faster processing—and network distribution software. Computer applications software engineers analyze users' needs and design, create, and modify general computer applications software or specialized utility programs. For example, video game programmers are software engineers who plan and write video game software.

Computer support specialists provide technical assistance, support, and advice to customers and users. This group of occupations includes workers with a variety of titles, such as technical support specialists and help-desk technicians. These troubleshooters interpret problems and provide technical support for software and systems. They answer telephone calls, analyze problems using automated diagnostic programs, and resolve difficulties encountered by users. Support specialists may work either within a company or other organization that uses computer software, or directly for a computer software vendor.

Other computer specialists include a wide range of professionals who specialize in operation, analysis, education, application, or design for a particular piece of the system. Many are involved in the design, testing, and evaluation of network systems such as local area networks (LAN), wide area networks (WAN), the Internet, and other data communications systems. Specialty occupations reflect an emphasis on client-server applications and end-user support; however, occupational titles shift rapidly to reflect new developments in technology.

Sales and related occupations. A growing number of marketing and sales workers also are employed in this industry. To compete successfully in the online world, the presentation and features of software and other information technology content become increasingly important. For example, publishers of software that provides connections to the Internet must be able to differentiate their products from those of their competitors. Marketing and sales workers are responsible for promoting and selling the products and services produced by the industry.

Table 2. Employment of wage and salary workers in software publishers by occupation, 2006, and projected change, 2006-2016 (employment in thousands).

Training and Advancement

Occupations in the software publishing industry require varying levels of education, but in 2006, more than 8 in 10 workers held college degrees. The level of education and type of training required depend on the employer’s needs, which often are affected by such things as local demand for workers, project timelines, and changes in technology and business conditions.

Professional and related occupations. Although there are no universal educational requirements for computer programmers, workers in this occupation commonly hold a bachelor’s degree. Some hold a degree in computer science, mathematics, or information systems. Others have taken special courses in computer programming to supplement their study in fields such as accounting, inventory control, or other areas of business. Because employers’ needs are varied, a 2-year degree or certificate may be sufficient for some positions so long as applicants possess the right technical skills. In addition, some employers seek applicants with technical or professional certification. Certification can be obtained independently through a number of organizations, although many vendors now assist employees in becoming certified.

Entry-level computer programmers usually start working with an experienced programmer to update existing code, generate lines of one portion of a larger program, or write relatively simple programs. They then advance to more difficult programming assignments, and may become project supervisors. With continued experience, they may move into management positions within their organizations. Many programmers who work closely with systems analysts advance to systems analyst positions.

Most computer software engineers have at least a bachelor’s degree, in addition to broad knowledge and experience with computer systems and technologies. Common degree concentrations for applications software engineers include computer science and software engineering, and common degree concentrations for systems software engineers include computer science and computer information systems. Graduate degrees are preferred for some of the more complex software engineering jobs. Some employers also are seeking workers with additional knowledge and experience. For example, a computer software engineer interested in developing e-commerce applications should have some expertise in sales or finance. In addition, some employers are seeking applicants with technical or professional certification.

Computer software engineers who show leadership ability can become project managers or advance into management positions, such as manager of information systems or even chief information officer.

Persons interested in becoming a computer support specialist generally need only an associate’s degree in a computer-related field, as well as significant hands-on experience with computers. They also must possess strong problem-solving, analytical, and communication skills, because troubleshooting and helping others are their main job functions. As technology continues to improve, computer support specialists must constantly strive to stay up to date and acquire new skills if they wish to remain in the field. One way to achieve this is through technical or professional certification.

Computer support specialists who develop expertise in a particular program or type of software can advance to a position as a programmer or software engineer.

Sales and related occupations. Many marketing and sales workers are able to secure entry-level jobs with little technical experience, and acquire knowledge of their company’s products and services through on-the-job training. Computer specialists also have opportunities to move into sales positions as they gain knowledge of specific products and services. Computer programmers who write accounting software, for example, may use their specialized knowledge to sell such products to similar firms. Also, computer support specialists providing technical support for an operating system may eventually market that product, based on their experience and knowledge of the system.

Outlook

Employment in the software publishing industry has more than doubled since 1990. As firms continue to invest heavily in information technology, and as the demand for specialized software rises, employment in software publishing is projected to increase by 32 percent from 2006 to 2016.

Employment change. Wage and salary jobs in software publishing are expected to increase by 32 percent between 2006 and 2016, nearly three times as fast as the 11 percent growth projected for all industries combined. Growth will not be as rapid as it was during the technology boom of the 1990s, however, as the software industry begins to mature and as routine work is increasingly outsourced to workers in other countries.

Demand for software publishing services will grow as a result of an increasing reliance on information technology, combined with falling prices of computers and related hardware. Individuals and organizations will continue to invest in applications and systems software to maximize the return on their investments in equipment, and to fulfill their growing computing needs. Also, such investments usually continue even during economic downturns, because improved software boosts productivity, increases efficiency, and, in some cases, reduces the need for workers.

The growing reliance on the Internet will be a major driver of job growth. The way the Internet is used is constantly changing, and so is the software required to run the new and emerging computer applications. Electronic commerce, for example, has changed the way companies transact business. E-commerce is automating many steps in the transaction of business between companies, allowing firms to operate more efficiently. Businesses also are moving their supply networks online and developing online marketplaces. The sustained growth of electronic commerce, as well as the growing uses of intranets and extranets, will drive demand for increasingly sophisticated software tools geared towards these technologies. And, as the amount of electronic information stored and accessed continues to grow, new applications and security needs will increase demand for database software.

The proliferation of “mobile” technologies also has created demand for a wide variety of new products and services. For example, the expansion of the wireless Internet, known as WiFi, brings a new aspect of mobility to information technology by allowing people to stay connected to the Internet anywhere, anytime. As businesses and individuals become more dependent on this new technology, there will be an increased need for new software applications in order to maximize the potential of wireless products.

Another significant factor contributing to growth in software is computer security. Organizations invest heavily in software to protect their information and secure their systems from attack. And, as more individuals and organizations are conducting business electronically, the importance of maintaining computer system and network security will increase, leading to greater demand for security software.

Given the increasingly widespread use of information technology and the overall rate of growth expected for the industry, most occupations should grow very rapidly, although some faster than others. The most rapid job growth will occur among computer specialists—especially computer software engineers—as organizations continue to rely on software to maximize the return on their investments in equipment, and as individuals continue to use new and increasing amounts of software applications. Employment of computer programmers should continue to expand, but more slowly than that of other occupations, as more routine programming functions are automated, and as more programming services are outsourced offshore.

Job prospects. Job opportunities in software publishing should be excellent for most workers, given the rate at which the industry is expected to grow, and the increasing integration and application of software in all sectors of the economy. Professional workers should enjoy the best opportunities, reflecting employers' continuing demand for higher level skills to keep up with changes in technology. In addition, as individuals and organizations continue to conduct business electronically, the importance of maintaining system and network security will increase. Employment opportunities should be excellent for individuals involved in the development of security software.

Earnings

Industry earnings. Employees in the software publishing industry generally command higher earnings than the national average. All production or nonsupervisory workers in the industry averaged $1,444 a week in 2006, significantly higher than the average of $568 for all industries. This reflects the concentration of professionals and specialists who often are highly compensated for their skills or expertise. Given the pace at which technology advances in this industry, earnings can be driven by demand for specific skills or experience. Earnings in the occupations with the largest employment in software publishing appear in table 3.

Table 3. Median hourly earnings of the largest occupations in software publishers, May 2006

Occupation                                                Software publishers   All industries
General and operations managers                                 $61.09              $40.97
Computer and information systems managers                       $54.26              $48.84
Market research analysts                                        $43.08              $28.28
Computer software engineers, systems software                   $42.04              $41.04
Computer software engineers, applications                       $40.66              $38.36
Computer programmers                                            $38.11              $31.50
Computer systems analysts                                       $35.45              $33.54
Sales representatives, wholesale and manufacturing,
  technical and scientific products                             $34.39              $30.98
Network and computer systems administrators                     $33.05              $29.87
Computer support specialists                                    $22.24              $19.94

As one might expect, education and experience influence earnings as well. For example, hourly earnings of computer software engineers, applications ranged from less than $25.17 for the lowest 10 percent to more than $59.78 for the highest 10 percent in May 2006. Managers usually earn more because they have been on the job longer and are more experienced than their staffs, but their salaries also can vary by level and experience. For example, hourly earnings of computer and information systems managers ranged from less than $35.30 for the lowest 10 percent to more than $70.00 for the highest 10 percent in May 2006. Earnings also may be affected by size, location, and type of establishment, hours and responsibilities of the employee, and level of sales.

Benefits and union membership. Workers generally receive standard benefits, including health insurance, paid vacation and sick leave, and pension plans. Unionization is rare in the software publishing industry. In 2006, virtually no workers were union members or covered by union contracts, compared with 13 percent of workers throughout private industry.

Sources of Additional Information

Links to non-BLS Internet sites are provided for your convenience and do not constitute an endorsement.

Further information about computer careers is available from:

* Association for Computing Machinery, 2 Penn Plaza, Suite 701, New York, NY 10121-0701. Internet: http://www.acm.org

* National Workforce Center for Emerging Technologies, 3000 Landerholm Circle SE., Bellevue, WA 98007. Internet: http://www.nwcet.org

Information on the certified software development professional program can be found at:

* Institute of Electrical and Electronics Engineers Computer Society, Headquarters Office, 1730 Massachusetts Ave. NW., Washington, DC 20036-1992. Internet: http://www.computer.org/certification

* University of Washington Computer Science and Engineering Department, AC101 Paul G. Allen Center, Box 352350, 185 Stevens Way, Seattle, WA 98195-2350. Internet: http://www.cs.washington.edu/WhyCSE/

Information on the following occupations can be found in the 2008–09 Occupational Outlook Handbook:

* Computer and information systems managers

* Computer programmers

* Computer scientists and database administrators

* Computer software engineers

* Computer support specialists and systems administrators

* Computer systems analysts

NAICS Codes

Suggested citation: Bureau of Labor Statistics, U.S. Department of Labor, Career Guide to Industries, 2008-09 Edition, Software Publishers, on the Internet at http://www.bls.gov/oco/cg/cgs051.htm (visited May 08, 2008).

Last Modified Date: March 4, 2008

May 7th, 2008

Voice Post

598K 3:05
(no transcription available)

Limit Orders

Limit Order

An order placed with a brokerage to buy or sell a set number of shares at a specified price or better. Limit orders also allow an investor to limit the length of time an order can be outstanding before being canceled.

Define: Stop Order

Stop Order

An order to buy or sell a security when its price surpasses a particular point, thus ensuring a greater probability of achieving a predetermined entry or exit price, limiting the investor's loss or locking in his or her profit. Once the price surpasses the predefined entry/exit point, the stop order becomes a market order.

Also referred to as a "stop" and/or "stop-loss order".
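A rough sketch of the two order types in code. The class and method names below are my own invention for illustration, not any real brokerage API; it only captures the price logic described above:

```java
// Illustrative sketch of limit vs. stop order semantics.
// All names are invented for this example; no real brokerage API is implied.
public class OrderDemo {

    // A limit order executes only at the limit price or better.
    static boolean limitBuyExecutable(double limitPrice, double marketPrice) {
        return marketPrice <= limitPrice; // "or better" for a buy means at or below the limit
    }

    // A stop order sits dormant until the stop price is surpassed,
    // then becomes a market order (fills at the next available price).
    static boolean stopSellTriggered(double stopPrice, double marketPrice) {
        return marketPrice <= stopPrice; // a sell stop triggers when price falls to or through the stop
    }

    public static void main(String[] args) {
        // Buy limit at $50: executable at $49.80, not at $50.25
        System.out.println(limitBuyExecutable(50.00, 49.80)); // true
        System.out.println(limitBuyExecutable(50.00, 50.25)); // false

        // Sell stop (stop-loss) at $45: dormant at $46.00, triggers at $44.90
        System.out.println(stopSellTriggered(45.00, 46.00)); // false
        System.out.println(stopSellTriggered(45.00, 44.90)); // true
    }
}
```

Note how the stop order guarantees a trigger point but not a fill price, while the limit order guarantees a price but not a fill.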

May 4th, 2008

check this URL out http://www.tvunetworks.com and download the player and broadcast player to watch free international TV and/or broadcast your own TV station to the world. Very cool, totally free, no bullshit subscription or anything. I watch it every day, it's great. I like the Bloomberg financial channel and the world international news channels. Any channel under 200 bandwidth usually won't connect, but once you connect to a channel it streams without interruption.

May 1st, 2008


Thug'd out hood rap music available in the iTunes radio section... WildFMRadio.com

The rest of the stations are weak... right now they're playing that new Lil Boosie.
Best Entry-Level Certifications
Those who wish to walk the certification trail, or climb one or more certification ladders as they tackle increasingly more difficult or demanding subjects, have to start somewhere. All the certifications in this list represent popular places for IT professionals to start. Most are highly regarded and remain widely sought-after. In the interest of brevity, these are listed in no particular order and without much additional supporting detail:

1. Cisco Certified Network Associate (CCNA): See www.cisco.com/go/ccna.
2. Certified Wireless Network Administrator (CWNA): See www.cwne.com/cwna/.
3. Sun Certified Programmer for the Java 2 Platform (SCJP): See suned.sun.com/US/certification/java/java_progj2se.html.
4. Red Hat Certified Technician (RHCT): See www.redhat.com/about/presscenter/2002/press_rhct.html.
5. LPI Level 1 (LPIC1): See www.lpi.org/en/certification.html.
6. SANS GIAC Security Essentials Certification (GSEC): See www.giac.org/subject_certs.php#GSEC.
7. CompTIA A+: See www.comptia.org/certification/a/.
8. CompTIA Security+: See www.comptia.org/certification/security/.
9. CompTIA Network+: See www.comptia.org/certification/network/.
10. Microsoft Certified Professional (MCP): See www.microsoft.com.

April 30th, 2008

Republican Women of Madison
The Republican Women of Madison meet the 3rd Wednesday of each month.
The social is at 11:00 AM and the meeting begins at 11:30 at the Madison Radisson.
For more Information, please contact Sandra Schimmelpfennig (Sandra1AL@knology.net) at 721-9989 or Penny Melton (pennygirl41@bellsouth.net) at 426-3395.

Madison County Republican Men’s Club
The Madison County Republican Men’s Club meets the 3rd Saturday of each month.
The social is at 7:00 AM and breakfast begins at 8:00 AM at the
Trinity United Methodist Church, 607 Airport Road (Enter on the East side of the building)
Breakfast cost is $7. Ladies are welcome.
For More Information, please contact Elbert Peters at (256) 859-3186 or jepeters65@knology.net

Republican Women of Huntsville
The Republican Women of Huntsville meet the 1st Tuesday of each month.
The social is at 11:00 AM and the meeting begins at 11:30 AM at
The Valley Hill Country Club in Southeast Huntsville
For more information contact Frances Taylor at (256) 509-1667 or franceseliz@netscape.net

Tennessee Valley Republican Club
The Tennessee Valley Republican Club meets the 2nd Saturday of each month.
The social hour begins at 8:00 AM and the meeting begins at 9:00 AM at the
Madison Radisson, 8721 Madison Blvd in Madison, AL.
For more information contact Douglas Cook at (256) 772-0783 or dcook530@knology.net

Twickenham Republican Women
The Twickenham Republican Women meet the 3rd Tuesday of each month at the Holiday Inn Select (formerly the Huntsville Hilton)
The social time begins at 11:45 AM and the meeting begins at Noon.
For more information please contact Linda Coats 837-8198 or twrc@mgop.org

Madison County Young Republicans
The Madison County Young Republicans meet the 2nd Thursday of each month at
7:00 PM.
For more information please contact Shannon Moore at 551-0670 or smoore551@bellsouth.net

UAH College Republicans
The UAH College Republicans meet the 3rd Thursday of each month at
4:00 PM at the NCRH Phase II.
For more information contact Sarah Fluhler at (256) 653-5916 or fluhler@nsstc.uah.edu
What would a 21st century, Web 2.0 and Internet-integrated OS look like?

The name of this thing will be Genesis OS, or OSg (OS Genesis)

The OS will be the NETWORK and the PLATFORM, fully integrated with the WWW. This will be the Vista that never was, or all that Vista claimed it would be (but we really found out it's worse than XP)

Vista will never be adopted mainstream or in the enterprise b/c it is full of security flaws and can't use 2 or more cores properly.

64-bit, or better yet 128-bit, optimized

New File Storage Methods

Instead of a directory tree, what about a tag cloud structure (for example, Digg Labs has one)?

So file retrieval becomes a visually rich and colorful experience, as opposed to Microsoft's DIR structure, which is essentially the same in Vista, just with more colors and 64 bits.
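At its core, a tag-based index like this is just a map from tag to file set; retrieval intersects tags instead of walking paths. Everything below is made up for illustration, not any real OS API:

```java
import java.util.*;

// Toy tag-cloud file index: files are retrieved by tag rather than
// by remembering a directory path. All names invented for this sketch.
public class TagIndex {
    private final Map<String, Set<String>> tagToFiles = new HashMap<>();

    // Attach one or more tags to a file.
    public void tag(String file, String... tags) {
        for (String t : tags) {
            tagToFiles.computeIfAbsent(t.toLowerCase(), k -> new HashSet<>()).add(file);
        }
    }

    // All files carrying a given tag -- no path needed.
    public Set<String> filesFor(String tag) {
        return tagToFiles.getOrDefault(tag.toLowerCase(), Collections.emptySet());
    }

    // Files carrying every one of the given tags (set intersection).
    public Set<String> filesForAll(String... tags) {
        Set<String> result = null;
        for (String t : tags) {
            Set<String> s = filesFor(t);
            if (result == null) result = new HashSet<>(s);
            else result.retainAll(s);
        }
        return result == null ? Collections.emptySet() : result;
    }

    public static void main(String[] args) {
        TagIndex idx = new TagIndex();
        idx.tag("beach.jpg", "photos", "vacation");
        idx.tag("kids.mp4", "videos", "family");
        idx.tag("sunset.jpg", "photos", "vacation", "favorites");
        // contains beach.jpg and sunset.jpg (order not guaranteed)
        System.out.println(idx.filesForAll("photos", "vacation"));
    }
}
```

The visual tag-cloud layer would just be a front end over a structure like this, sizing each tag by how many files it holds.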


1. DESKTOP -- the desktop should come with a dazzling array of digital art and photography preinstalled, with no "find more online" links. Users shouldn't have to PAY for a decent wallpaper

2. Tab Bar at bottom (annoying as hell and nonfunctional half the time).



3. THAT HORRIBLE START BUTTON -- instead, there will be several different categories of buttons that do the functions of the once singular 'start button': MUSIC... Movies... Games... Documents... Internet

4. Mozilla Firefox comes preinstalled instead of IE. There will be absolutely ZERO Microsoft stuff preinstalled

5. OpenOffice.org or another word processor replaces Office

6. The top tab bar will be an address bar so that you can type in a web address whenever you feel like it, whether or not a browser is open

7. Instead of using in-house apps, we'll use all third-party stuff that ACTUALLY WORKS AND HAS A CUSTOMER BASE

(Because most users will download some or all of these anyway, why not save them the download time and effort?)

List of Desired 3rd Party Apps that will be Preinstalled on all systems with the OS:

iTunes and iPhone
Mozilla FireFox
Avant Browser
AVG antivirus Professional
Ruby Editor SCite
LindenLabs Second Life
TVU Networks Broadcaster/Player
Yahoo IM
CounterStrike and Steam
LJournal Client or offline blog client
CCleaner
Spybot: Search and Destroy
AVG Anti-Spyware
Yahoo Mail
Google Mail
Bit Torrent
UltraHal 6.1
Audacity 1.26
Adobe Photoshop
Adobe Reader/Viewer
Pre-loaded Toolbars from popular sites (Yahoo, FaceBook, LinkedIn, Monster.com, Bit Torrent)

Keep in mind, this list is by no means exhaustive, and of course we have to get the individual companies' approval to feature their software; however, if the OS works well and looks good, I think we could get 90 percent of these involved. If not, then they lose out, not me...

What better story than a guy who comes out of nowhere and builds a better OS than Gates, Ballmer, and MSFT?

MSFT is like the Holy Catholic Church in the old days. Everyone wants to break away but no one can figure out how... this OS is the answer.

Instead of branding everything with a corporate logo, why not bring in the industry players who have a large user base and loyal following? Yahoo is king of messengers, so why not let them do what they do best--maintain and update instant messaging on the Internet, from the PC.

This OS will use a slick UI that relies on GRAPHICAL REPRESENTATIONS OF DATA instead of difficult-to-remember file directories that, more often than not, the user cannot recall, or hasn't taken the extra time required to know where he put his favorite YouTube video or the pictures of his kids.

So the file system will be accessed graphically, NOT sequentially or linearly.

The main hard disk will be '0'; any others will be sequentially labeled. The CD or DVD drive will be numbered after the last HD; in this case, 1.
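That numbering scheme is easy to sketch: hard disks first starting at 0, optical drives continuing the sequence. Names here are invented for illustration:

```java
import java.util.*;

// Sketch of the proposed drive numbering: hard disks are numbered
// first, starting at 0, then optical drives continue the sequence.
public class DriveNumbering {
    static Map<Integer, String> assignNumbers(List<String> hardDisks, List<String> opticalDrives) {
        Map<Integer, String> numbered = new LinkedHashMap<>();
        int n = 0;
        for (String hd : hardDisks) numbered.put(n++, hd);     // main HD gets 0
        for (String od : opticalDrives) numbered.put(n++, od); // optical after the last HD
        return numbered;
    }

    public static void main(String[] args) {
        System.out.println(assignNumbers(List.of("main HD"), List.of("DVD drive")));
        // {0=main HD, 1=DVD drive}
    }
}
```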
I talked to this guy on the phone that wants $6500 plus $44.95 to get me into this deal where I find properties for them for a 1% finder's fee. Hell naw! It would be cheaper for me just to get my own goddamn real estate license and pay the fee for the course and test. What the hell is wrong with people? I'm so sick of all these internet scumbags calling my phone and offering this 'opportunity' or that. Whenever someone offers you an 'opportunity,' grab your wallet (if they haven't jacked you already for some money).

The day before this guy called and tried to talk me into paying 90 dollars a month for web hosting. Hell naw! Not gonna happen. I enjoy wasting these people's time. A lot of times I'll pretend that I'm interested just to hang up on them in the middle of the call or whatever. Fuck these hataz.
We should be way ahead of where we are in terms of teaching intelligent systems to THINK, TALK, LEARN, and so on.

Right now I'm trying to program an AI client I call 'HAL64'. All Rights Reserved. Copyright DMC 2008. Anyway, it was or is inspired by Zabaware's UltraHal 6.1, which I have on my PC. The programmer's name is Robert E. Medzecki and he's 24. He wrote that in VBScript; however, I most likely will write HAL64 in whichever language works best for 64-bit coding. Perhaps .NET, Java, or C#.

It has to work well with 2 or more cores too, which is difficult.


1. optimized for 64bit systems
2. optimized for dualcore and quadcore
3. runs well on Vista
4. integrated with web browsers, email and IM accounts
5. large knowledge base comprised of minimum 1800+ diverse documents, audio, and video
6. can pass Turing Test
7. can win the Loebner Prize like Zabaware did with UltraHal 6.1
8. communicates with some or all of other HAL64 agents all around the world
9. runs in system tray
10. rich GUI using 3d rendering with program like Quest3d or better
11. able to learn new knowledge from user and web automatically with or without user input
12. uses MLN (Markov logic network)
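Item 11 above (learning new knowledge from the user) can be sketched at its simplest as a key-value memory that grows with conversation. This toy is invented purely for illustration and has no relation to UltraHal's actual internals:

```java
import java.util.*;

// Minimal sketch of feature 11: an agent that learns facts from user
// input ("X is Y") and answers from its growing knowledge base
// ("what is X"). Invented for illustration only.
public class Hal64Sketch {
    private final Map<String, String> knowledge = new HashMap<>();

    public String respond(String input) {
        input = input.trim();
        // "what is X" retrieves a stored fact.
        if (input.toLowerCase().startsWith("what is ")) {
            String key = input.substring(8).replaceAll("[?]", "").trim().toLowerCase();
            String fact = knowledge.get(key);
            return fact != null ? key + " is " + fact : "I don't know yet. Teach me!";
        }
        // "X is Y" teaches a new fact.
        int split = input.toLowerCase().indexOf(" is ");
        if (split > 0) {
            String key = input.substring(0, split).trim().toLowerCase();
            String value = input.substring(split + 4).trim();
            knowledge.put(key, value);
            return "Learned: " + key + " is " + value;
        }
        return "Tell me something in the form 'X is Y'.";
    }

    public static void main(String[] args) {
        Hal64Sketch hal = new Hal64Sketch();
        System.out.println(hal.respond("Huntsville is a city in Alabama"));
        System.out.println(hal.respond("What is Huntsville?"));
    }
}
```

A real agent would of course need parsing, inference, and the web-learning pieces on the list, but the store-and-retrieve loop is the seed of all of it.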
Intel currently is doing what they call 'research' on an 80-core computer system. Dubbed 'WildFire', this juggernaut is capable of handling more than 9 million users, and does a hell of a lot more than beat people at chess...
Almost all social networks, like Facebook and MySpace, are merely tools corporations use to sell and promote crappy products that no one needs nor wants

C Sharp

Visual C#

C# is a simple, type-safe, object-oriented, general-purpose programming language. Visual C# provides code-focused developers with powerful tools and language support to build rich, connected web and client applications on the .NET Framework.