Email and Forum Questions
Profiles in IT: John V. Blankenbaker
Forbes Report: America's Most Wired Cities
USB 3.0 Should Hit the Shelves in 2009
Update: Digital TV Conversion
BMW Studies Car-to-Car Communication
WikiDashboard Released by PARC
Cox Communications Monitors Internet Access
Book of the Week: The Medici Effect by Frans Johansson (2006)
Biomimicry is an example of the Medici Effect
Food Science: Convection Ovens
Email from John: Dear Tech Talk, I know that all computers have a unique MAC address. But how traceable are they? If my laptop gets stolen and I know my MAC address, can I get it back if the person who stole it connects to the Internet? Thanks, John.
Tech Talk Answers: MAC stands for Media Access Control; it is the hardware address permanently assigned to your network interface card. It is a physical address, as opposed to a logical address, which depends on your location in the network. The MAC address should be unique. However, it is possible to spoof the MAC address with a software override.
The MAC address is used by the network to identify which piece of hardware a packet of information is to be sent to. In other words, it’s used only on connections from one piece of networking equipment to the next.
That means that when information leaves your computer it carries your computer's MAC address, but when it arrives at your router that MAC address is removed. When the information is sent by your router further upstream to your ISP's router, it contains the MAC address of your router. When it moves from the ISP's router to another router on the Internet, it contains the MAC address of the ISP's router. And so on. The practical consequence: your laptop's MAC address never travels beyond the first router it connects to, so it cannot be traced across the Internet.
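Since the MAC address never leaves the local link, the only place you can reliably read it is on the machine itself. Here is a minimal Python sketch for printing the local MAC address; note that uuid.getnode() falls back to a random 48-bit number when the hardware address cannot be determined:

```python
import uuid

def local_mac() -> str:
    """Format this machine's MAC address as six colon-separated hex octets.

    Caveat: uuid.getnode() may return a randomly generated 48-bit number
    (with the multicast bit set) if no hardware address can be read.
    """
    node = uuid.getnode()  # 48-bit integer
    return ":".join(f"{(node >> shift) & 0xFF:02x}" for shift in range(40, -8, -8))

print(local_mac())  # e.g. "a4:5e:60:d1:22:3b"
```

Recording this value while you still have the laptop is harmless, but as explained above it only identifies the machine on its local network segment.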
There are various software tracing tools you might want to look into for stolen-laptop recovery. While not all survive a reformat of the machine, many thieves try to connect a stolen laptop as-is in order to recover or steal the data on it, at which point these tracing tools kick in.
Profiles in IT: John V. Blankenbaker
John Blankenbaker created Kenbak-1, the world’s first personal computer, advertised for $750 in Scientific American.
The Kenbak-1 was designed in 1970 and pre-dated microprocessors.
John Blankenbaker was born in Berkeley County, South Carolina on July 4th 1933.
He received a BS in Physics and Math from Oregon State College in 1952, an MS in Physics from UCLA, and an MSEE from MIT.
John Blankenbaker started the design of a computing device in 1949, when he was a freshman at OSC, to calculate logarithms for his weekly physics lab.
In the summer of 1951, John worked for NBS on the Standards Eastern Automatic Computer (SEAC).
After graduating from Oregon State in 1952, he worked at Hughes Aircraft Company and was assigned to a department working on digital computers.
At Hughes, he found that a computer design needed at most one flip-flop if it had the appropriate memory. This spurred him to design a computer for private use.
After the Hughes business data processor unit was terminated, he returned to school and earned an MSEE from MIT.
After a short period of consulting, he worked eight years at Scantlin Electronics, an early pioneer in real time communications to bring stock market prices to brokers.
In 1970, he founded Kenbak Corporation and designed a small computer, the Kenbak-1, which was based on small-scale integrated circuits.
Using standard medium-scale and small-scale integrated circuits, the Kenbak-1 relied on switches for input and lights for output from its 256-byte memory.
Instead of being microprocessor based, Kenbak-1 was built almost entirely from TTL.
The Intel 4004 (the world's first microprocessor) was introduced the next year, in 1971.
Kenbak-1 was a true stored-program computer that offered 256 bytes of memory, a wide variety of operations and a speed equivalent to nearly 1MHz.
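To illustrate what "a true stored-program computer" means, here is a minimal Python sketch of a toy machine with 256 bytes of memory, in which the program and its data live side by side in the same memory. The instruction set below is invented for illustration; it is not the Kenbak-1's actual instruction set:

```python
# Toy stored-program machine: 256 bytes hold both code and data.
# Two-byte instructions: (opcode, operand). Invented opcodes, not Kenbak-1's.
MEM_SIZE = 256

def run(memory: bytearray) -> bytearray:
    pc, acc = 0, 0                 # program counter and accumulator
    while True:
        op, operand = memory[pc], memory[pc + 1]
        if op == 0:                # HALT
            return memory
        elif op == 1:              # LOAD: acc = memory[operand]
            acc = memory[operand]
        elif op == 2:              # ADD:  acc += memory[operand] (mod 256)
            acc = (acc + memory[operand]) % 256
        elif op == 3:              # STORE: memory[operand] = acc
            memory[operand] = acc
        pc += 2

mem = bytearray(MEM_SIZE)
# Program at address 0: load mem[16], add mem[17], store to mem[18], halt.
mem[0:8] = bytes([1, 16, 2, 17, 3, 18, 0, 0])
mem[16], mem[17] = 40, 2           # the data, in the same 256-byte memory
print(run(mem)[18])                # 42
```

On the real Kenbak-1 you would toggle such a program in through front-panel switches and read results from the lights.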
One year later he sold the first two of these to a private girls' school.
Approximately 40 of these machines were built and sold for $750, mostly to schools.
The largest program Blankenbaker ever wrote for the Kenbak-1, which took the very last byte of memory, was a program to play 3D tic-tac-toe (4 × 4 × 4). Actually, it was a bit short of memory: there was not room for the program to recognize when someone had won.
The world just wasn’t quite ready for personal computing and the Kenbak-1 lacked some critical capabilities (such as expandability and I/O).
The slot on the front panel was presumably intended to account for these deficiencies.
In 1973, after selling only 40 machines, Kenbak Corp. closed its doors.
The Kenbak-1 was purchased by CTI Educational Products and renamed the CTI 5050.
The Computer Museum of Boston judged this to be "the first commercially available personal computer."
Blankenbaker later worked for International Communications Sciences on a system to transmit voice over 9600-bps lines, and for Symbolics Corporation to create a LISP computer.
He retired in 1985 and works with the Germanna Colonies, a genealogy organization.
USB 3.0 Should Hit the Shelves in 2009
USB was created by a core group of companies consisting of Intel, Compaq, Microsoft, Digital, IBM, and Northern Telecom.
The USB 1.0 specification was introduced in 1996.
Originally USB was intended to replace the multitude of connectors at the back of PCs, as well as to simplify software configuration of communication devices.
USB 1.1 came out in September 1998 with a data rate of 12 Mbits/s.
The USB 2.0 specification was released in April 2000 and was standardized by the USB-IF at the end of 2001. It operated at a data rate of 480 Mbit/s.
The USB 3.0 specification was released on November 17, 2008 by the USB 3.0 Promoter Group. It has a transfer rate of 5 Gbit/s (10× USB 2.0) and has been dubbed SuperSpeed USB.
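A quick way to appreciate the jump between these specs is to compute idealized transfer times at each signaling rate. A small Python sketch; real-world throughput is lower because of encoding and protocol overhead:

```python
def transfer_seconds(size_bytes: int, rate_bits_per_s: float) -> float:
    """Idealized transfer time at the raw signaling rate
    (real throughput is lower due to encoding and protocol overhead)."""
    return size_bytes * 8 / rate_bits_per_s

GIB = 2**30  # one gibibyte
rates = {"USB 1.1": 12e6, "USB 2.0": 480e6, "USB 3.0": 5e9}
for name, rate in rates.items():
    print(f"{name}: {transfer_seconds(GIB, rate):.1f} s per GiB")
```

Even as a best-case number, the difference between roughly 12 minutes (USB 1.1) and under 2 seconds (USB 3.0) per gibibyte explains the "SuperSpeed" branding.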
Update: Digital TV Conversion
Congress has officially pushed back the analog to digital television transition from February 17 to June 12.
However, the FCC may allow broadcasters to make the DTV transition earlier.
Any station planning to switch on Feb. 17th must contact the FCC by COB Feb. 9th.
Major broadcast networks, including ABC, CBS, Fox and NBC, will not switch off analog signals until this spring, though local affiliates are free to do as they please.
The FCC has the final call on each station and can decide a station must wait to make the switch if it has not done enough to inform its viewers.
Numerous stations have come forward and said they will drop analog broadcasts in less than two weeks.
There are 1,796 full-power TV stations in operation in the United States, with hundreds expected to switch over on the original date.
BMW Studies Car-to-Car Communication
On a frosty morning, imagine if the car 100 feet ahead of you could somehow alert you to black ice on an off-ramp. You’d slow down, and your car’s electronic stability system could even take preliminary steps to anticipate the situation.
Car-to-car communication is the next step in safety technology.
The Center for Automotive Research has discussed this for years.
There is even a federal program called Intelligent Transportation Systems.
According to VP of engineering Tom Baloga, BMW's progress toward car-to-car communication is "moving forward very well."
U.S. automakers have agreed upon a standardized frequency, 5.9 GHz, regardless of the car. 5.9 GHz is the same frequency European cars use.
The car is going to act like a data-collection probe. The car’s location will be transmitted to other cars and to an infrastructure.
This data will be used to identify traffic flow, slippery conditions, and bottlenecks.
Maintenance crews could find pothole-ridden areas based on suspension kinematics data, while salt crews could deduce which streets were especially icy using data from antilock braking or electronic stability systems.
Naturally, there's another side to this: how much do you want on the public record about your car, and, by extension, your driving habits?
WikiDashboard Released by PARC
The Palo Alto Research Center (PARC) has created a tool, called WikiDashboard, that is designed to reveal much of the normally hidden back-and-forth behind Wikipedia's most controversial pages.
Wikipedia already has procedures in place designed to alert readers to potential problems with an entry.
For example, one of Wikipedia’s volunteer editors can review an article and tag it as "controversial" or warn that it "needs sources."
WikiDashboard shows which users have contributed most edits to a page, what percentage of the edits each person is responsible for, and when editors have been most active.
A WikiDashboard user can explore further by clicking on a particular editor's name to see, for example, how involved he or she has been with other articles. PARC researcher Ed Chi says that the goal is to show the social interaction going on around the entry.
For instance, the chart should make it clear when a single user has been dominating a page, or when a flurry of activity has exploded around a particularly contentious article.
The timeline on the chart can also show how long a page has been neglected.
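The per-editor statistics WikiDashboard surfaces can be sketched as a simple tally over a page's revision history. The usernames and revision data below are hypothetical:

```python
from collections import Counter

def edit_shares(revisions: list[str]) -> dict[str, float]:
    """Percentage of a page's edits attributable to each editor,
    given the revision history as a list of editor usernames,
    ordered from most to least active."""
    counts = Counter(revisions)
    total = len(revisions)
    return {user: 100 * n / total for user, n in counts.most_common()}

history = ["alice", "bob", "alice", "carol", "alice", "bob"]  # hypothetical data
print(edit_shares(history))  # alice dominates with 50% of the edits
```

The real dashboard adds the time dimension (when each editor was active), but the "who dominates this page" signal is essentially this tally.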
Cox Communications Monitors Internet Access
Cox Communications, the nation's third-largest cable company, on Tuesday unveiled a plan to monitor and slow Internet content it deems unimportant.
Cox joins the ranks of other Internet providers willing to tempt legal fate by getting between customers and their access to the free-flowing Web.
Comcast, which the FCC sanctioned last year for just this type of interference, had secretly blocked access to legal file-sharing applications for users the cable giant deemed "bandwidth hogs."
Comcast reportedly has now joined AT&T in a new effort to filter Web traffic for files deemed inappropriate by movie and recording industry lawyers.
Cox has decided that certain Web traffic is "less time-sensitive" and will be blocked in favor of other "more timely" content during periods of high congestion.
They plan to test this system on their lucky customers in Kansas and Arkansas.
Most ISPs will be using deep packet inspection, or DPI, which allows network managers to inspect, track and target user messages.
DPI is the Internet equivalent of the mailman opening and reading your mail to decide whether or not to deliver it.
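The signature-matching idea behind DPI can be sketched in a few lines. This is a naive illustration, not how production DPI engines work (they reassemble flows and use far more robust heuristics); the BitTorrent handshake prefix is one commonly cited plaintext signature:

```python
# Naive signature-based deep packet inspection: scan a packet's payload
# for a known application fingerprint. Illustrative only.
SIGNATURES = {
    b"\x13BitTorrent protocol": "bittorrent",  # BitTorrent handshake prefix
    b"GET ": "http",                           # start of an HTTP GET request
}

def classify_payload(payload: bytes) -> str:
    """Label a payload by the first matching signature, else 'unknown'."""
    for sig, label in SIGNATURES.items():
        if payload.startswith(sig):
            return label
    return "unknown"

print(classify_payload(b"GET /index.html HTTP/1.1\r\n"))  # http
```

The point of the mailman analogy is visible here: classification requires reading the message body itself, not just the address on the envelope.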
Last year, ISPs declared before Congress that they were siding with Internet users and "keeping their distance" from DPI. But we did our own deep packet inspection and found that the network providers' actions often speak louder than their testimony.
Book of the Week: The Medici Effect by Frans Johansson (2006)
Johansson, founder and former CEO of an enterprise software company, argues that innovations occur when people approach situations with an eye toward putting available materials together in new combinations.
The Intersection of two disciplines is the best place to innovate.
The Rise of Intersections
Movement of People
Convergence of Science
Leap in Computation Capability
Creating the Medici Effect
Breaking Down the Barrier Between Fields
Expose Yourself to a Range of Cultures
Randomly Combine Concepts
Brainstorm to Ignite an Explosion of Ideas
Combining disciplines has the potential to make you more productive and unique.
Biomimicry is an example of the Medici Effect
Biomimicry is the practice of developing sustainable human technologies inspired by nature.
The most famous example of biomimicry is the Velcro brand fastener, invented in 1941 by Swiss engineer George de Mestral.
He took the idea from the burrs that stuck to his dog’s hair.
Under the microscope he noted the tiny hooks on the end of the burr’s spines that caught anything with a loop – such as clothing, hair or animal fur.
Gecko Tape is a material covered with nanoscopic hairs that mimic those found on the feet of gecko lizards.
These millions of tiny, flexible hairs exert van der Waals forces that provide a powerful adhesive effect.
They allow you to stick to anything, even underwater or in space.
Inspired by the evolved ability of shark skin to reduce drag by manipulating the boundary-layer flow as the fish swims, researchers are developing coatings for ships' hulls, submarines, aircraft fuselages, and even swimwear for humans.
Based on the varying shape and texture of a shark's skin over its body, Speedo's Fastskin FSII swimsuits made their appearance at the Beijing Olympics and may have helped US swimmer Michael Phelps to his record eight gold medals in that competition, as well as the rest of the team.
Food Science: Convection Ovens
Convection ovens create a uniform temperature with internal fans that circulate hot air.
Convection ovens are larger and more expensive than standard, or radiant, ovens, but they cook food faster, at a lower temperature, and with better results.
Fans ensure that the same temperature reaches the top and bottom of foods, as well as foods at all rack levels.
A frequent complaint of cooks with radiant ovens is that bottoms of foods get scorched, while tops are not browned evenly. Convection ovens correct this by using a fan that blows the hot air throughout the oven.
A "true" or "European" convection oven goes one step further by adding a third heating element. Thus, the fan actually blows pre-heated air, rather than distributing the already-heated air. These are the most expensive and effective types of ovens.
Most recipes can be cooked in about 25% less time, which saves energy. You may also need to slightly lower the cooking temperature, finding the right setting by trial and error.
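The common rule of thumb for adapting a radiant-oven recipe, drop the temperature about 25 degrees Fahrenheit or cut the time by roughly 25 percent, can be captured in a tiny helper. Treat the output as a starting point for the trial and error mentioned above:

```python
def convection_adjust(temp_f: float, minutes: float) -> tuple[float, float]:
    """Rule-of-thumb conversion from a radiant-oven recipe to a
    convection oven: lower the temperature about 25 degrees F and
    shorten the time by roughly 25%. A starting point, not a guarantee."""
    return temp_f - 25, minutes * 0.75

temp, time = convection_adjust(350, 60)  # a 350 F / 60 min radiant recipe
print(temp, time)  # 325 45.0
```

In practice you would apply one adjustment or the other (temperature or time) and check the dish early the first few times.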
You’ll notice that convection ovens seal in the juices of meat so dishes taste more flavorful and moist. Baked goods, such as pies or cookies, will be perfectly browned, even if you place them on different racks.
Pastry will come out better, too, because the heat doesn't fuse the flour and butter but allows them to form flakes.