Email and Forum
Profiles in IT: Sam DiVita and Richard Sturzebecher
Digital Universe Bigger Than Thought
Sony is offering to remove some of the trial software
www.donotreply.com
Website of the Week: Questionaut
NSA Security Configuration Guides
Great resources for your IT career advancement
Wireless auction yields mixed results for consumers
The Importance of Virtualization
White House Taps Tech Entrepreneur for Cyber Defense Post
Devices to Drive People Crazy
Possible Salt Flats on Mars
More Online Apps to Replace Client-Side Apps
Email from Victoria: Dear Doc, I wonder if you can answer a question for me. My computer screen has become dim. Why would this be? Everything’s plugged in. Is it dying? Or is there a way to lighten it that I don’t know about? But, then, why would it have gone dim on its own in the first place? Victoria.
Tech Talk Answers: Here are three things to consider.
The illumination level of a laptop is adjustable. Usually one function key increases it and one reduces it. Depending on your laptop, the combination may be Ctrl-Function, Alt-Function, or Shift-Function. You will need to check the manual.
The energy saver options may be set for reduced light levels. You can reach the energy saver settings through the Control Panel. It may be set to save energy even when the laptop is plugged in.
Finally, all laptops are set for low illumination level when they are unplugged in order to lengthen battery life. You can override this using the energy saver option, but I don’t recommend it.
Victoria answers: You’re a genius! Shift-function plus a button with the sun on it and an arrow and I can see for the first time in weeks! Wow! Thank you. Victoria
Profiles in IT: Sam DiVita and Richard Sturzebecher
Sam DiVita was the inspiration behind the idea to use fused silica for Fiber Optics.
Sam was Manager of Material Research at the US Army Signal Corps Laboratory at Fort Monmouth, New Jersey.
In 1958, he was asked to develop a replacement for copper cable and wire.
Sam thought glass fiber and light signals might work, but the engineers who worked for Sam told him a glass fiber would break!
In September 1959, Sam DiVita talked to 2nd Lt. Richard Sturzebecher, who had melted three triaxial glass systems using SiO2 for his 1958 senior thesis at Alfred University under Dr. Harold Simpson, Professor of Glass Technology.
Richard knew that ultra-pure SiO2 was very transparent and that Corning Glass made high-purity SiO2 powder by oxidizing pure SiCl4.
In 1961 and 1962, the idea of using high purity SiO2 for a glass fiber to transmit light was made public information in a bid solicitation to all research laboratories.
Sam awarded the contract to the Corning Glass Works in Corning, New York in 1962. Federal funding for glass fiber optics at Corning was about $1,000,000 between 1963 and 1970.
In 1966, Kao and Hockham at STC Laboratories (STL) in Harlow proposed optical fibres for communication, showing that the loss of 1,000 dB/km in existing glass (compared to 5-10 dB/km in coaxial cable) was due to contaminants, which could potentially be removed.
In 1970, Corning designed and produced optical waveguide fibers made of fused silica, through which at least 1% of the light remained after traveling one kilometer (20 dB/km).
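The dB/km figures in this history fit a single formula: loss in decibels is 10·log10(P_in/P_out), so 20 dB/km means only 1% of the light survives each kilometer. A minimal Python sketch of the relationship (function names are mine, for illustration):

```python
import math

def attenuation_db(p_in, p_out):
    """Loss in decibels: 10 * log10(P_in / P_out)."""
    return 10 * math.log10(p_in / p_out)

def surviving_fraction(loss_db_per_km, km):
    """Fraction of input power remaining after `km` of fiber."""
    return 10 ** (-loss_db_per_km * km / 10)

# Corning's 1970 fiber: 20 dB/km means 1% of the light survives 1 km.
print(surviving_fraction(20, 1))    # 0.01
# Third-generation fiber at 0.2 dB/km keeps ~63% of the light over 10 km.
print(surviving_fraction(0.2, 10))
```

The same function shows why 0.2 dB/km fiber allows repeater spacing in excess of 100 km while 1,000 dB/km glass was useless for communication.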
The Corning team of Robert Maurer (Ph.D. MIT, 1951), Donald Keck (Ph.D. Michigan State University, 1967), and Peter Schultz (Ph.D. Rutgers University, 1967) was awarded the patent (Patent #3,711,262).
On 22 April, 1977, General Telephone and Electronics sent the first live telephone traffic through fiber optics, at 6 Mbit/s, in Long Beach, California.
In 1980, first-generation fiber-optic communication systems operated at a wavelength of 0.8 µm and used GaAs semiconductor lasers, achieving a bit rate of 45 Mbit/s with repeater spacing of 10 km.
The first transatlantic telephone cable to use optical fiber was TAT-8, based on Desurvire optimized laser amplification technology. It went into operation in 1988.
Second-generation fiber-optic systems, developed for commercial use in the early 1980s, operated at 1.3 µm, used InGaAsP semiconductor lasers, and reached bit rates of up to 1.7 Gbit/s with repeater spacing up to 50 km.
Third-generation fiber-optic systems operated at 1.55 µm with a loss of about 0.2 dB/km and used single-mode, low-dispersion fiber, operating commercially at 2.5 Gbit/s with repeater spacing in excess of 100 km.
Fourth-generation fiber-optic communication systems use optical amplification to reduce the need for repeaters and wavelength-division multiplexing to increase fiber capacity. Bit rates of up to 14 Tbit/s have been reached over a single 160 km line using optical amplifiers.
The rest is history. Sam worked at Fort Monmouth until 1987.
Digital Universe Bigger Than Thought
A white paper released last week from IDC revised the research firm’s earlier estimates to show that by 2011, the amount of electronic data created and stored will grow to 10 times the 180 exabytes that existed in 2006, reflecting a compound annual growth rate of almost 60%.
By 2011, there will be 1,800 exabytes of electronic data in existence, or 1.8 zettabytes.
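The growth figures above can be sanity-checked with a standard compound-growth calculation; a small Python sketch (the `cagr` helper is mine, not IDC's):

```python
def cagr(start, end, years):
    """Compound annual growth rate between two totals."""
    return (end / start) ** (1 / years) - 1

# 180 exabytes in 2006 growing to 1,800 exabytes (1.8 zettabytes) by 2011
rate = cagr(180, 1800, 2011 - 2006)
print(f"{rate:.1%}")  # 58.5%, i.e. "almost 60%"
```

Ten-fold growth over five years works out to about 58.5% per year, matching IDC's "almost 60%" figure.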
The study, "The Diverse and Exploding Digital Universe," also found that the number of electronic containers for that data (files, images, packets and tag contents) is growing 50% faster than the data itself.
And by 2011, data will be contained in more than 20 quadrillion (20 million billion) of those containers, creating a tremendous management challenge for both businesses and consumers.
Along with the new white paper, which was sponsored by EMC Corp., IDC created a "Personal Digital Footprint Calculator," which resides on the same Web page as EMC’s worldwide digital growth-tracking ticker, which counts the amount of data created second by second.
IDC said the bigger numbers were the result of faster growth in digital cameras and televisions, as well as a better understanding of data replication.
Less than half of the digital data being created by individuals can be accounted for by user activities, such as photos, phone calls and e-mails.
The majority of the data is made up of what IDC calls digital "shadows," including surveillance photos, Web search histories, financial transaction journals and mailing lists.
Sony is offering to remove some of the trial software
Buyers of the configure-to-order versions of its Vaio TZ2000 and Vaio TZ2500 laptops can opt to have Sony remove some of its own applications.
The "Fresh Start" option, billed as a software optimization, costs $49.99, and is only available to customers choosing to pay an additional $100 to upgrade the operating system to Windows Vista Business.
PC manufacturers are often paid by software publishers to include such trial versions on the computers they ship.
Bloatware (also called craplets or crapware) poses problems for businesses because it reduces system performance and available hard disk space.
Dell was one of the first PC manufacturers to offer to remove bloatware.
Everex followed suit a week later, saying it would eliminate bloatware from a $300 desktop machine for consumers.
Update: Responding to a tidal wave of outrage, Sony has reversed a plan to charge $50 to remove all the pre-installed applications from its high-end TZ-series notebooks.
www.donotreply.com
As the owner of www.donotreply.com, Seattle-based programmer Chet Faliszek receives millions of wayward e-mails each week.
Many companies use an address at donotreply.com to indicate that the recipient should not reply to the message. However, many recipients reply anyway.
The majority of the e-mails naturally are from spammers.
But many of the misdirected e-mails amount to serious security and privacy violations.
In February, Faliszek began receiving e-mails sent by Yardville National Bank in New Jersey (now part of PNC). Included in the message were PDF documents detailing every computer the bank owned that was not currently patched against the latest security vulnerabilities.
Faliszek posted another bank screw-up last month, after he began receiving replies from Capital One customers inquiring about various details of their accounts.
Faliszek bought donotreply.com back in 2000 when he and some friends were running an e-mail service. "We all thought it would be funny to send e-mail from an account at donotreply.com," Faliszek said.
Go to www.donotreply.com to see more errant emails in Faliszek’s blog.
NSA Security Configuration Guides
The NSA publishes security configuration guides for a wide range of platforms and technologies, including:
Operating Systems (Mac, Windows, Sun Solaris, Redhat Linux)
Switches (Cisco IOS)
VoIP and IP Telephony (Deployment and Architecture)
Web Servers and Browsers (Microsoft IIS, IE5.5, Netscape)
Wireless (Bluetooth and 802.11)
Great resources for your IT career advancement
Google versus Yahoo and Leadership
Google’s business model of internet-search-driven advertising has become so dominant that competitors Microsoft and Yahoo can hardly compete.
When Google founders Sergey Brin and Larry Page wanted a C.E.O. for their rapidly growing company in 2001, they turned to a technology executive, Eric Schmidt, who had previously worked at Sun Microsystems and Novell.
Coincidentally, Yahoo co-founders Jerry Yang and David Filo were also looking for a C.E.O. that year, and they picked a Hollywood insider: Terry Semel, who had run Warner Bros.
Hollywood failed; technology prevailed.
Since signing on with Google, Schmidt, 52, has channeled the founders’ strategic vision and the company’s technological assets to create a Web-search and online-advertising company with $5.7 billion in profits in 2007.
Yahoo, meanwhile, has fallen behind Google technologically and is now fighting a hostile takeover by Microsoft. Semel quit last year.
Wireless auction yields mixed results for consumers
The 700 MHz wireless spectrum auction closed last Thursday.
For the first time in such an auction, the FCC required winners of some of the spectrum to allow any phone and any application to run on their new networks.
These "open access" terms mean that end users should be able to choose from a wider selection of devices, along with new types of Web 2.0 services to run on them.
The change affects mainly Verizon, which won almost all of the licenses that must follow the open access rules.
Google entered the auction but did not win any licenses, although its participation was seen by many as way to promote the open access requirement, rather than as an attempt to become a network operator.
Verizon and AT&T, another big winner, will most likely use the spectrum to offer high-speed data services as an alternative to cable or DSL. This will be a boon for rural residents, who sometimes have trouble getting broadband.
The networks will probably use the new LTE (Long Term Evolution) cellular technology.
The new networks are unlikely to deliver cheaper services for users as some had hoped, however, at least not for a while. The operators will need to pay off the billions of dollars they pledged for the spectrum, in addition to the investment in the new networks.
In a blog post, Google called it a victory for end users.
The Importance of Virtualization
Virtualization essentially lets one computer do the job of multiple computers, by sharing the resources of a single computer across multiple environments.
Virtual servers and virtual desktops let you host multiple operating systems and multiple applications locally and in remote locations, freeing you from physical and geographical limitations.
In addition to energy savings and lower capital expenses due to more efficient use of your hardware resources, you get high availability of resources, better desktop management, increased security, and improved disaster recovery processes when you build a virtual infrastructure.
Virtualization is a proven software technology that is rapidly transforming the IT landscape and fundamentally changing the way that people compute.
Today’s powerful x86 computer hardware was originally designed to run only a single operating system and a single application, but virtualization breaks that bond, making it possible to run multiple operating systems and multiple applications on the same computer at the same time, increasing the utilization and flexibility of hardware.
Virtualization is used to scale server rooms and to allow multiple operating systems on the same client.
The latest chips from both Intel and AMD support hardware virtualization.
Multiple operating systems are then installed on top of this virtual interface. Each OS thinks it is the only OS running; if no other systems are in use, it gets all the resources.
However, over-commitment of resources is a hotly debated topic right now.
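As a rough illustration of the hardware support mentioned above: on Linux, the CPU advertises the Intel (vmx) or AMD (svm) virtualization extension in the flags line of /proc/cpuinfo. A hypothetical Python sketch that checks for those flags (the function name and return strings are mine):

```python
def hw_virt_support(cpuinfo_text):
    """Report which hardware virtualization extension the CPU advertises,
    based on the `flags` line of a /proc/cpuinfo dump."""
    for line in cpuinfo_text.splitlines():
        if line.startswith("flags"):
            flags = line.split(":", 1)[1].split()
            if "vmx" in flags:   # Intel's extension
                return "Intel VT-x"
            if "svm" in flags:   # AMD's extension
                return "AMD-V"
    return None

# On a real Linux host you would feed it the actual file:
# with open("/proc/cpuinfo") as f:
#     print(hw_virt_support(f.read()))
print(hw_virt_support("flags\t\t: fpu vmx sse2"))  # Intel VT-x
```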
White House Taps Tech Entrepreneur for Cyber Defense Post
The White House is expected to announce the selection of Rod A. Beckstrom to head a new inter-agency group charged with coordinating the federal government's efforts to protect its computer networks from organized cyber attacks.
Beckstrom is an author and entrepreneur best known for starting Twiki.net, a company that provides collaboration software for businesses.
The new inter-agency group, which will coordinate information sharing about cyber attacks aimed at government networks, is being created as part of a government-wide "cyber initiative."
The presidential directive expanded the intelligence community’s role in monitoring Internet traffic to protect against a rising number of attacks on federal agencies’ computer systems.
Beckstrom will report directly to Homeland Security Secretary Michael Chertoff.
The government has acknowledged that its information systems have been the target of repeated cyber attacks originating in other countries.
By all accounts, Beckstrom is neither a cyber-security expert nor a Washington insider. But his private-sector background and published writings emphasize a decentralized approach to managing large organizations.
Devices to Drive People Crazy
The Phantom Keystroker
Attach this evil prank device to your victim’s computer and it makes random mouse movements and types out odd garbage text and phrases.
Just plug it into any USB port.
No drivers needed. Just plug and go.
Even better than the $10 Annoy-a-tron, which generates a short beep every few minutes. Your target will have a hard time timing the location of the sound, because the beeps come at random intervals ranging from 2 to 8 minutes.
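The randomized interval is exactly what makes the Annoy-a-tron hard to localize. A toy Python sketch of such a beep schedule (the function name and parameters are illustrative, not the device's actual firmware):

```python
import random

def beep_schedule(n, lo_min=2.0, hi_min=8.0, seed=None):
    """Cumulative beep times in minutes, with each gap drawn uniformly
    from [lo_min, hi_min] minutes -- unpredictable, like the Annoy-a-tron."""
    rng = random.Random(seed)
    times, t = [], 0.0
    for _ in range(n):
        t += rng.uniform(lo_min, hi_min)
        times.append(t)
    return times

print(beep_schedule(5, seed=1))
```

Because no two gaps are the same, a victim can never wait "one interval" and catch the next beep.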
Possible Salt Flats on Mars
Satellite imagery reveals thick Martian salt deposits scattered across the planet's southern surface.
The sodium chloride deposits serve as more evidence of Martian water, which could have been hospitable to life.
The salt deposits probably formed from dried-up brine pools, which would not have been as acidic as other places on Mars where water is thought to have existed, such as clay and hydrated mineral deposits.
Sites such as those found by the Mars Exploration Rovers show sulfur in high levels, which means any water there may have been too harsh to support life.
One of the researchers added that some of the oldest organisms ever discovered on Earth have been found locked away in salt crystals.
Using the Mars Odyssey orbiter’s Thermal Emission Imaging System (THEMIS), the NASA research team found dozens of strange sites in a belt just south of Mars’ equator.
It took them a couple of years to figure out what they were.
These will be good future sites for a Mars rover to check out.
More Online Apps to Replace Client-Side Apps
More online apps are emerging to populate the cloud.
The spread of broadband connections has made them practical.
Watch for low-cost Internet computers with a Linux OS.