One Person's Point of View

PART 1-4

By: Hangtime55


THE Y2K BUG

All worldwide governments and corporations have lately been waking up to the fear of the Y2K bug, better known as the Millennium Bug. Y2K is not merely one of the negatives but THE negative in the world's last 50 years of positive growth, growth owed in part to the enhancements and advantages of today's computer technology.

But with the advantages of where technology has brought the human race in the last 50 years come the disadvantages of applying it to every facet of our personal, commercial and, some say, spiritual way of life. We can easily become, and have become, virtually dependent on what makes our lives simple, pleasurable and productive, while remaining unaware of the consequences that every advantage holds. We as consumers have placed the burden of these disadvantages in the hands of those who make technology available to us. It is our right to choose what we want or need in our everyday lives, but it is the responsibility of those who provide us this technology to assure that the technology itself will prevail through time with little or no concern for its performance.

Public and private organizations are scrambling to correct the Y2K bug before the millennium comes to a close. With this uncertainty comes the reality that the technology we have become dependent upon may be disrupted, if not discontinued, for either the short term or an extended period of time.

It has been stated that "programmers in the early days of computers a few decades ago commonly shortened four-digit years to two digits in an effort to conserve a byte or two of disk space and memory - both precious commodities at the time but now as abundant as dirt."
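The quoted shortcut can be sketched in a few lines. The snippet below is purely illustrative (modern Python standing in for the legacy languages actually involved), and the function name is invented for the example; it shows why arithmetic on two-digit years breaks at the century rollover.

```python
# Illustrative sketch of the two-digit year shortcut described above.
# Legacy systems stored only the last two digits of the year; simple
# subtraction then gives elapsed time.

def years_elapsed(start_yy, end_yy):
    """Elapsed years computed from two-digit years, as legacy code did."""
    return end_yy - start_yy

print(years_elapsed(65, 99))  # 1965 to 1999: prints 34, correct
print(years_elapsed(65, 0))   # 1965 to 2000: prints -65, nonsense
```

A record stamped "00" sorts before, not after, one stamped "99", so date comparisons, interest calculations and expiry checks all invert at the rollover.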

It seems only too real now that this problem, like the many other problems that have been put aside time and time again, will come back twentyfold in its complexity to correct. Only this time the ultimate fix may come too late, and the world will be the one to suffer the consequences. How could a problem such as the Y2K bug, the very root sickness of man's accomplishments, have been ignored for so long?
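For context on what "the ultimate fix" involved, one widely used remediation technique, not described in the original text, is "windowing": interpreting two-digit years relative to a pivot. The sketch below is a minimal illustration; the pivot value of 50 is an assumption chosen for the example, and real systems picked pivots to suit their own data.

```python
PIVOT = 50  # illustrative assumption: two-digit years below 50 read as 20xx

def expand_year(yy):
    """Expand a two-digit year to four digits using a fixed window."""
    return 2000 + yy if 0 <= yy < PIVOT else 1900 + yy

print(expand_year(99))  # prints 1999
print(expand_year(0))   # prints 2000
```

Windowing only postpones the problem (a "49" and a "50" written a century apart would again collide), which is why full four-digit conversion was the preferred fix where feasible.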

THE COMPUTER

In the 1930s, punched-card machine techniques had become well established and reliable, while several research groups still only dreamed of building automatic digital computers. One promising machine, constructed of standard electromechanical parts, was built by an International Business Machines (IBM) team led by Howard Hathaway Aiken. Aiken's machine, called the Harvard Mark I, handled 23-decimal-place numbers (words) and could perform all four arithmetic operations as well as trigonometric functions. The Mark I was originally controlled from prepunched paper tape without provision for reversal, so that automatic "transfer of control" instructions could not be programmed. Output was by card punch and electric typewriter.

In the 1940s, the outbreak of World War II produced a desperate need for computing capability, especially for the military. New weapon systems were produced for which trajectory tables and other essential data were lacking. In 1942, John W. Mauchly, John Presper Eckert, Jr., and their associates at the Moore School of Electrical Engineering of the University of Pennsylvania decided to build a high-speed electronic computer to do the job. This machine became known as ENIAC (Electronic Numerical Integrator And Computer, or Calculator). The size of its numerical word was 10 decimal digits, and it could multiply two such numbers at the rate of 300 products per second. ENIAC used 18,000 standard vacuum tubes, occupied 167.3 square meters (1,800 square feet) of floor space, and consumed about 180,000 watts of power. ENIAC is generally acknowledged to be the first successful high-speed electronic digital computer (EDC) and was productively used from 1945 to 1955. In 1945, intrigued by the success of ENIAC, the mathematician John von Neumann undertook a theoretical study of computation that demonstrated that a computer could have a very simple fixed physical structure and yet be able to execute any kind of computation effectively by means of properly programmed control, without the need for changes in hardware. Von Neumann contributed a new understanding of how practical fast computers should be organized and built; these ideas, often referred to as the stored-program technique, became fundamental for future generations of high-speed digital computers. The machine had an all-purpose 'computer memory', which became an assembly place in which parts of a long computation were stored, worked on piecewise, and assembled to form the final results. If the name John von Neumann sounds familiar, it should be.

During WWII, von Neumann served as a consultant to the armed forces, where his valuable contributions included a proposal of the implosion method for making a nuclear explosion and his assistance in the development of the hydrogen bomb. The first generation of modern programmed electronic computers to take advantage of these improvements appeared in 1947 (1947 - this year tends to come up in human history). This group included computers using random-access memory (RAM), which is a memory designed to give almost constant access to any particular piece of information. These machines had punched-card or punched-tape input and output devices and RAMs of 1,000-word capacity with an access time of 0.5 microseconds (0.5 x 10^-6 sec). Physically, they were much more compact than ENIAC. This group of machines included EDVAC and UNIVAC, the first commercially available computer.

Early in the 1950s two important engineering discoveries changed the image of the electronic computer field, from one of fast but often unreliable hardware to an image of relatively high reliability and even greater capability. These discoveries were the magnetic-core memory and the transistor-circuit element. These new technical discoveries rapidly found their way into new models of digital computers; RAM (random-access memory) capacities increased from 8,000 to 64,000 words in commercially available machines by the early 1960s, with access times of 2 or 3 microseconds. Although these machines were very expensive to purchase or to rent, and especially expensive to operate, this was an opportunity that freed up some 'space' for a four-digit representation of years, instead of the two digits then being used.

In the 1960s, efforts to design and develop the fastest possible computers with the greatest capacity reached a turning point with the completion of the LARC machine, built for the Livermore Radiation Laboratories of the University of California by the Sperry-Rand Corporation. The LARC had a core memory of 98,000 words and multiplied in 10 microseconds. During this period the major computer manufacturers began to offer a range of computer capabilities and costs, as well as various peripheral equipment: such input means as consoles and card feeders; such output means as page printers, cathode-ray-tube displays, and graphing devices; and optional magnetic-tape and magnetic-disk file storage. These found wide use in business for such applications as accounting, payroll, inventory control, ordering supplies and billing. With the growth in computer technology over the previous 10 years, and with the applications beginning to reach the commercial markets, this was again an opportunity that freed up even more 'space' for a four-digit representation of years.

The trend during the 1970s was, to some extent, away from extremely powerful, centralized computational centers and toward a broader range of applications for less costly computer systems. Most continuous-process manufacturing, such as petroleum refining and electrical-power distribution systems, now uses computers of relatively modest capability for controlling and regulating their activities. Moreover, a new revolution in computer hardware came about, involving miniaturization of computer-logic circuitry and of component manufacture by what are called large-scale integration, or LSI, techniques. The earlier research of the 1960s involving photoprinting of conductive circuit boards to eliminate wiring became highly developed. It then became possible to build resistors and capacitors into the circuitry by photographic means, known today as the printed circuit (board). By the 1970s, vacuum deposition of transistors was becoming common, and entire assemblies, such as adders, shifting registers, and counters, became available on tiny 'chips'. The unsolved problem of the Y2K bug was beginning to be overshadowed by the anticipated advances in this technology that would soon make it possible to introduce the first personal computer to the public.

In the 1980s very large-scale integration (VLSI), in which hundreds of thousands of transistors are placed on a single chip, became increasingly common. Many companies, some new to the computer field, had introduced in the 1970s programmable minicomputers supplied with software packages. The size-reduction trend continued with the introduction of the PC (personal computer). With the revelation that a computer was now available for the common person to use, little was known of these machines - how they worked or how they could fail. Like releasing a virus or bacteria into the population, the unsolved Y2K bug was released with little or no concern for what the future consequences would be. Programmers seem to have taken the old "if it ain't broke, then don't fix it" attitude, one that could come back to haunt both the creators and the users in a now not-so-far-off future.

THE TRANSITION

As mentioned, most continuous-process manufacturing, such as petroleum refining and electrical-power distribution systems, now uses computers of relatively modest capability for controlling and regulating their activities. Everyone from the privately owned ma-and-pa family businesses to the corporate giants of the world is reducing work shifts, shutting down factories and dropping subsidiaries. At the same time, producers of personal computers continued to proliferate. Specialty companies were emerging in increasing numbers, each company devoting itself to some special area of manufacture, distribution, or customer service. Corporations and their three major components of operation - production, marketing and finance - were also beginning to realize the advantages of this new technology.

In production, the manufacturers that had once sparingly applied artificial intelligence to their payrolls, in hopes of improving productivity under the close supervision of a human presence, were now reversing the process, with the artificial intelligence keeping a closely watched human operator/supervisor in check. The computer could perform the tasks of 10 people, surpassing them in quality, cost and speed. With the predicted long-term profit margin and the lower overhead costs leaning toward the side of the manufacturer, there was only one conclusion.

Distribution had already begun to expand with the acceleration of production. Personnel running the front offices found that computer technology was rapidly taking over the workplace. Communications, advertisement and consignments of their improved product were in some part associated with the new-age computer technology. The selection and training of personnel had always been a continuous process, but personnel management, once involved in recruiting and training administrators, found its own jobs at risk as well. There were fewer positions to be filled, and one of its major tasks, negotiating with labor unions, utterly vanished with the reduction of personnel due to the increase of technological automation. A few highly trained systems engineers were replacing the duties of the common workplace, and the already fewer semi-skilled laborers who were able to survive the onslaught of this newly acquired technology were reserved for conventional duties.

Financial management has always been the emphasis of any growing company. Its duties of searching for adequate sources of capital and managing the capital already invested were clearly evident in the one-time re-tooling of facilities from human labor to technological robotics. Estimating operations with the new technology was more precise; deciding whether to use short-term or long-term credit was easily forecast with the windfall of profits already saved. Budgeting for these profits and planning capital expenditures could be accurately tallied on a daily basis, allowing a company to choose more precisely the right times to issue its stocks or sell its bonds.

Everything that the computer age had promised was coming into being. Nothing that could have altered the continuous growth by way of technology seemed evident. Of course, none of us truly know the inner workings of the computer; this was left up to the experts. The programmers and engineers forgot to let us in on this one little problem? This one little 'bug' that might screw up this whole wonderful world of technology? I think the term "work the bugs out of it" must have come from an experience of long ago - perhaps even from the very programmers who then dismissed it as a minor mal-execution. Whether this is due entirely to their procrastination over this error, or whether it was honestly a vague memory of yesterday's programmer, the true fault still lies with every programmer who knew of the vulnerability in this technology but did nothing.

TECHNOLOGY'S FIRST BLACK PLAGUE

We as a race have allowed the computer to control almost every facet of our lives. Whether this was meant to be or whether we simply allowed it to happen, it is nevertheless occurring. The computer age came upon us so rapidly and conclusively that we bear some responsibility for it. In the workplace, where jobs have been taken away from man and offered to technology like a lamb on the altar, we are only now feeling the advantages, as well as the misgivings. The computer age has spread like a plague throughout the world. And like a plague, certain precautions were not forecast. Throughout human history we have learned that the human race is constantly in danger of being infected with many strains of viruses and bacteria. When these infections go unchecked and unattended, plague usually results. We have learned the basic procedures by which we as human beings can reduce the risk of contracting any sort of disease. But still, even those steps are not enough.

The Y2K bug might very well become the world's first deadly plague, or 'Black Death', of the computer age. Consider the bubonic plagues of the 6th and 14th centuries: the 6th-century plague killed an estimated 100 million people, while the 14th-century plague killed an estimated 75 million. The Y2K bug is obviously not a bacteria or infection that can affect a human being's health and cause them to fall and die in their footsteps. The point here is that a bubonic plague that affected that many people in those early times was transmitted by close contact. It was estimated that unless a victim who fell ill to its effects sought some sort of medical help within the first 15 hours of the transmission of the disease, there was a 60-90% chance of that victim dying within a few days. This is a very close analogy to what the black death of the computer age can create. Computers are relatively young in the history of man. Although we have had our small outbreaks of computer viruses at both irregular and systematic intervals, warnings and symptoms of these types of computer viruses have led to cures for them. As with your own computers at home, unless you knew something about their operations, and noticed the slight differences, if any, in their operations during use, you would never know that you had been infected. Your computer can be infected by a virus through another computer's corrupted files, e-mail and copied software programs that haven't been properly scanned for such infections, but especially and primarily over the Internet, where this type of close contact with other computers - or, by analogy, with other previously infected victims - has taken place.

The sad and unfortunate cause of the Y2K bug was not our education, training, awareness or even a separate computer virus, but rather the simple excuse that programmers would like us to believe: they had forgotten about it? I can't see how this poor and irresponsible explanation for one of the biggest, if not the biggest, foul-ups in human history could simply be explained and accepted as "I forgot."

In human law, when a person with HIV transmits the disease purposely and willingly, without mention of it to their partner, it is a crime. I would imagine that if they pleaded "I forgot" as their defense, they would be maliciously displaying both their ignorance and their guilt. It would, however, be a whole different story if a person honestly didn't know that they had contracted HIV at an earlier time in their life. As for Y2K, this wasn't an extraneous flaw in computer technology. It was a known consequence and an acknowledged omen: everyone, from the researchers, developers, manufacturers and programmers to the retail outlets, knew that this flaw in technology, left dismissed, would have a catastrophic result in the future.

THE MEDIA

Many news outlets, like network television news broadcasts and city newspapers in the U.S., have reported and printed articles about the Y2K bug for about the last 6 months. The media's science analysts are only now informing us of the implications of the Y2K bug, attempting to take the side of righteous reporting that has lacked distinction ever since, in my view, the last mission of Apollo 17 splashed down on December 19, 1972. Only now are they reporting to their viewers and readers the sudden revelations that may very well affect every one of us. Where was the in-depth reporting before the Y2K issue became a public concern? Like many of our newsgroups that claim to inform and educate us on the events of our times, they usually tend to take the nonchalant view of explosive controversies, being very careful not to step on the toes of those who tell them what to report and how to report it to the general public.

Where have the science departments of our most prominent network news stations and city newspapers been for the last 25 years? The Face on Mars? The Phoenix Lights? UFOs? These are events that even I must admit aren't your everyday discussion at the nearest water cooler at the office. Is this because topics like these are of the irrational type, or is it because our yearning for news of this kind is censored from all major media conduits by those who have very short toes?

In a time in our lives when public opinion polls show that over 50% of adult Americans think UFOs are "real," over 70% think the government is hiding information about UFOs, and 40% think the government is concealing information proving the existence of UFOs, the lack of concern or attention placed on the sharing of this new kind of information is not the fault of the reader but the fault of the reporter. The proclamation of 'Freedom of the Press' that cried out all across this great new country over 200 years ago, at its time of birth, was a testimony to all who governed that the right of every individual who sought knowledge and condemned anarchy was represented in a publicized forum, for all to witness and to distinguish what was best for and necessary in preserving the new independence that had been conceived. It was a methodical process that allowed for the 'checks and balances' against what a small population had fled from years earlier. As long as the people were kept informed of their governing bodies, social issues and the current affairs of their nation, then peace within its own agreed-upon standings would continue to prevail. Take this away, or alter its rudimentary form, and opposition will soon ensue.

cont...
