The Bearable Lightness of Blimps

Environmentally Conscious Airfreight Alternative: The Blimp

An airfreight alternative by CargoLifter AG

Professor Sir David King, former UK chief scientific adviser, addressed the World Forum on Enterprise and Environment in Oxford earlier this month. Professor King said that helium-filled blimps are a practical means* of transporting high-value perishable goods such as fresh fruit, vegetables and cut flowers.

Several major air and transit companies, including Lockheed Martin, Boeing and CargoLifter AG (see photograph above), have designs in progress. The Guardian reported that U.K. government funding has been allocated for development, or rather re-development, of this 70-year-old technology with the intent of adapting it for modern contexts. King indicated that implementation could lead to widespread use of blimps for freight transport as soon as 2020. While that seems rather optimistic, it is conceivable for specialized commercial purposes such as transporting greenhouse flowers from The Netherlands to North America.

A few matters for investigation:

  • What will power the blimp motors? Blimps are not like hot-air balloons, and most motors need fossil fuels for power.
  • Blimp infrastructure for landing, take-off and traffic control would require significant investment. There are functioning blimps, presumably for advertising purposes, in regular use today. The Goodyear Blimp needs to take off and land somewhere, so some bases must exist. There was a large blimp hangar in Lakehurst, New Jersey, and another in Sunnyvale, California, which may still be serviceable.
  • Blimp pilots. Again, there are individuals flying blimps now. However, the Federal Aviation Administration (FAA) would need to set up a variety of new guidelines and training programs.

*Helium is the safe alternative to the flammable hydrogen associated with the Hindenburg disaster in the years before World War II.

Published on July 25, 2010 at 3:14 am

Amazon Transfers Data Faster than the Speed of Light?

Actually, Amazon doesn’t transfer data faster than the speed of light. But Amazon Web Services (AWS) does have a high-speed internal network that will move customer data faster than the internet. To the dilettantes in our midst, that might seem to imply a transfer rate faster than electromagnetic radiation, which in theory travels at the speed of light, and which should likewise be the speed at which signals pass through the “ether of the internet”. However, this isn’t quite the case.

Gratuitous illustration of electromagnetic spectrum

Data transfer: passing through the historical aether

Amazon’s breakthrough technology, providing both storage and processing services “in the cloud”, is not particularly recent news. The beta release occurred over a year ago. The salient part of the story is here:

We are excited to announce the limited beta of AWS Import/Export, a new offering that accelerates moving large amounts of data into and out of AWS using portable storage devices for transport. AWS transfers your data directly on and off storage devices using Amazon’s high-speed internal network and bypassing the Internet. For significant data sets, AWS Import/Export is often faster than Internet transfer and more cost-effective than upgrading your connectivity.

AWS is now rolling out Import/Export for Amazon S3, a premium storage solution designed for mission-critical and primary data storage. What are the portable storage devices referred to above? Any storage device with a USB or eSATA connector that draws power from a U.S.-standard wall socket, 120 volts at 60 hertz. Amazon’s high-speed internal network is not electronic: it is internal combustion powered. Yes, the “Amazon high-speed internal network” is Amazon’s internal network of vehicles, probably trucks. Really quite sensible of Amazon. The AWS Import/Export Calculator helps users weigh Amazon S3 data transfer charges and transfer time against loading data directly at an AWS Import/Export facility, at rates as fast as 500 Mbps.

If you have large amounts of data to load and an Internet connection with limited bandwidth, the time required to prepare and ship a portable storage device to AWS can be a small percentage of the time it would take to transfer your data over the internet. If loading your data over the Internet would take a week or more, you should consider using AWS Import/Export.
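
Here is a back-of-the-envelope version of that comparison, in Java. Every input below (data set size, upstream bandwidth, courier time) is an illustrative assumption rather than AWS pricing or the calculator’s actual formula; only the 500 Mbps load rate comes from the post above.

    public class SneakernetCalculator {

        // Days needed to push a data set through a link, assuming a sustained
        // utilization factor (real links are rarely saturated around the clock).
        static double internetDays(double dataTerabytes, double linkMbps, double utilization) {
            double bits = dataTerabytes * 1e12 * 8; // terabytes -> bits
            double seconds = bits / (linkMbps * 1e6 * utilization);
            return seconds / 86400;
        }

        public static void main(String[] args) {
            double dataTB = 5.0;       // assumed data set size
            double linkMbps = 10.0;    // assumed upstream bandwidth
            double shippingDays = 2.0; // assumed courier time to the AWS facility
            double loadMbps = 500.0;   // load rate quoted above

            double overTheWire = internetDays(dataTB, linkMbps, 0.8);
            double sneakernet = shippingDays + internetDays(dataTB, loadMbps, 1.0);

            System.out.printf("Internet transfer:  %.1f days%n", overTheWire);
            System.out.printf("Ship + direct load: %.1f days%n", sneakernet);
        }
    }

With these particular assumptions, shipping the device wins by roughly a factor of twenty, which is exactly the calculator’s point.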

Why not combine old and new technologies to give the best possible service to your customers? AWS is offering free inbound S3 data transfer through November 2010. After that, the AWS Import/Export service should become still more interesting as an option to improve price-performance for data storage in the clouds.

LATE BREAKING NEWS FLASH: Check out these AWS sticker photos! Delivered the old-fashioned way: by internal combustion, jet engine and/or freight train powered snail-mail. And if you want some too, you merely need to do as suggested by AWS Evangelist Jeff Barr and they WILL be yours. If only all wishes were so easily granted.

Published on July 22, 2010 at 11:42 am

Random Numbers in Java

This post is about alternatives to java.util.Random, the class most commonly used to generate random numbers in Java.

Dice players of antiquity

Random: playing the odds in ancient Rome (Osteria della Via de Mercurio)

java.util.Random

This class generates “random enough” values: numbers with only an informal level of randomness. They merely look random to an observer, usually a human observer rather than a machine. These are considered low-quality random numbers and should never be used in any security application.
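
For reference, typical usage looks like this (a minimal sketch):

    import java.util.Random;

    public class RandomDemo {
        public static void main(String[] args) {
            Random rng = new Random();    // seeds itself (time-based) by default
            int die = rng.nextInt(6) + 1; // uniform on 1..6
            double u = rng.nextDouble();  // uniform on [0.0, 1.0)
            System.out.println("die=" + die + "  u=" + u);
        }
    }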

Truly random numbers must have the following characteristics:

  • Any number is equally likely to be generated on every iteration. This is also known as a uniformly distributed series of numbers.
  • No dependency exists between successive numbers as they are generated; each draw is statistically independent of the draws that came before it.

Alternatives to better match your random needs

Neil Coffey of Javamex describes three alternative random number generation methods. Each approach keeps the java.util.Random interface while replacing the underlying algorithm. The first, called the XORShift generator, produces medium-quality random numbers very quickly, with only a single state variable and very simple code. This method is very well suited to J2ME games.
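
A minimal sketch of the idea: drop a 64-bit XORShift into the java.util.Random plumbing by overriding next(int bits). The shift triple (21, 35, 4) is one of Marsaglia’s published choices; see Mr. Coffey’s article for his exact code.

    import java.util.Random;

    // 64-bit XORShift generator: one long of state, three shifts per draw.
    public class XorShiftRandom extends Random {
        private long state;

        public XorShiftRandom(long seed) {
            // The state must never be zero, or the generator sticks at zero forever.
            this.state = (seed != 0) ? seed : 0x9E3779B97F4A7C15L;
        }

        @Override
        protected int next(int bits) {
            state ^= (state << 21);
            state ^= (state >>> 35);
            state ^= (state << 4);
            return (int) (state >>> (64 - bits));
        }
    }

Because every public method of java.util.Random (nextInt, nextDouble and friends) funnels through next(int bits), overriding that one method swaps in the new algorithm everywhere.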

The next algorithm generates much higher-quality random numbers. It is a combined generator, using two XORShift generators of the sort described above; Mr. Coffey provides the code and an explanation of the algorithm. The combined XORShift yields good-quality random numbers suitable for non-gambling games and simulations, although it runs slightly slower than java.util.Random.
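
The general shape of such a combination, under the same subclassing approach (the shift triples and the additive combining step here are illustrative; Mr. Coffey’s article has the real constants):

    import java.util.Random;

    // Two independent XORShift streams whose outputs are combined by addition.
    public class CombinedXorShiftRandom extends Random {
        private long s0, s1;

        public CombinedXorShiftRandom(long seedA, long seedB) {
            s0 = (seedA != 0) ? seedA : 0xDEADBEEFL; // neither state may be zero
            s1 = (seedB != 0) ? seedB : 0xCAFEBABEL;
        }

        @Override
        protected int next(int bits) {
            s0 ^= (s0 << 21); s0 ^= (s0 >>> 35); s0 ^= (s0 << 4);
            s1 ^= (s1 << 13); s1 ^= (s1 >>> 7);  s1 ^= (s1 << 17);
            return (int) ((s0 + s1) >>> (64 - bits));
        }
    }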

A cryptographic quality random number generator should have the following properties:

  1. It should be impossible to predict prior and future numbers from any number generated;
  2. The numbers should have no discernible biases;
  3. The generator should have a large period;
  4. The generator should be able to seed itself at any position within that period with equal probability.

A cryptographically secure random number generator is appropriate for security applications such as producing a web-server session id or picking encryption keys. Very high-quality random numbers are generated using java.security.SecureRandom as the replacement for java.util.Random. The trade-off between quality and CPU cycle consumption is hardly surprising: java.security.SecureRandom runs 20 to 30 times slower than any of the other algorithms.
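
A minimal sketch of the session-id case: draw 128 bits from SecureRandom and render them as hex (the token length and the hex encoding are arbitrary choices for illustration).

    import java.security.SecureRandom;

    public class SessionIdDemo {
        public static void main(String[] args) {
            SecureRandom rng = new SecureRandom(); // seeds itself from OS entropy on first use
            byte[] token = new byte[16];           // 128 bits of session-id material
            rng.nextBytes(token);

            StringBuilder hex = new StringBuilder();
            for (byte b : token) {
                hex.append(String.format("%02x", b & 0xff));
            }
            System.out.println("session id: " + hex);
        }
    }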

Relative comparison between the methods

Here is a very simplified way of understanding the differences between the algorithms.

Let’s say that we need to generate a 128-bit encryption key. java.security.SecureRandom actually picks from a pool of 2^127 possible keys. Of course java.util.Random can also be used to generate a 128-bit key; however, the values will be selected from a much smaller pool, on the order of 2^47 possible keys, because java.util.Random has a much shorter period, equal to 2^48.

The single XORShift generator falls between the two, with a longer period of 2^64. The combined XORShift generator approach extends the period a bit further.
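
To make the pool-size point concrete: both classes will happily emit 16 random bytes, but java.util.Random’s output is entirely determined by its 48-bit seed, so only about 2^48 distinct keys can ever come out of it (a sketch):

    import java.security.SecureRandom;
    import java.util.Random;

    public class KeyPoolDemo {
        public static void main(String[] args) {
            byte[] weakKey = new byte[16];
            new Random().nextBytes(weakKey);         // at most ~2^48 distinct outcomes

            byte[] strongKey = new byte[16];
            new SecureRandom().nextBytes(strongKey); // on the order of 2^128 possible keys
        }
    }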

Note that neither java.util.Random nor either of the XORShift generators is seeded randomly. This is why java.security.SecureRandom, with a machine-generated and much more truly random seed*, is superior.

* The machine-generated random seed is what is called entropy in random number generators.

Green Computing of the Future

Google’s massive computing facilities use 12V auto batteries as a back-up power source! Is it true? Yes, per David Kanter, editor-in-chief of Real World Technologies.
Google, much like Amazon, Yahoo! or Facebook, relies on aggregate computing power. Cost remains an issue, despite increasingly affordable hardware and the performance advantages of better systems architecture. This might be old news at Slashdot, as it was based on events of perhaps mid-2009. It was new to me though.

“Google shared some of its Borg hardware with the world… a 12V car battery was used for backup, rather than a UPS. Why? Modern UPS designs improve reliability in exactly the same way, but the conversion efficiency is not perfect. A good UPS might convert power at 90% efficiency. For a company with a dozen servers, a 10% power inefficiency is unfortunate, but may not be worth chasing after… [Google’s] car battery approach achieves 99.9% efficiency, and with hundreds of thousands of servers, that can make a big difference.”

I’d like to see the numbers. Reliability is based on expected failure rates. Without knowing more about the probability distribution of 12V car battery failures, and how multiple 12V batteries are configured for back-up power, I don’t have enough information to validate the claim. Other considerations: exactly how many 12V batteries does Google use? How much space do they require, and isn’t there some trade-off due to the enormous footprint of a vast 12V battery farm? If feasible, it is a wonderfully innovative and clever solution, using a relatively more environmentally friendly and functionally superior design for back-up power.
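
To put rough numbers on the efficiency claim (fleet size and per-server draw below are assumptions for illustration; only the 90% and 99.9% figures come from the quote):

    public class UpsEfficiency {
        public static void main(String[] args) {
            double servers = 200000;                 // assumed fleet size
            double wattsPerServer = 250;             // assumed draw per server
            double load = servers * wattsPerServer;  // total IT load, in watts

            double upsOverhead = load * (1 / 0.90 - 1);      // extra input power at 90% efficiency
            double batteryOverhead = load * (1 / 0.999 - 1); // extra input power at 99.9%

            System.out.printf("UPS conversion overhead:     %.2f MW%n", upsOverhead / 1e6);
            System.out.printf("Battery conversion overhead: %.2f MW%n", batteryOverhead / 1e6);
        }
    }

Under those assumptions, the UPS route burns more than 5 MW in conversion losses alone, against roughly 0.05 MW for the battery approach.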

Here’s another example of energy efficiency:

“Google simplifies their designs to only use 12V power rails. A modern motherboard requires several voltage planes: 3.3V, 5V and 12V. To provide each voltage, the power supply needs to convert 115V AC to the appropriate level. Instead, Google outputs 12V and uses high efficiency voltage conversion on the motherboard to go from 12V to other voltage levels”.

The macro-benefits of data center optimization are very green-friendly, and not exclusive to Google. Any large corporate data center consumes a great amount of electricity and outputs waste heat. Locating a data center near cheap power is sensible, which is presumably why Amazon and Google, amongst others, have chosen the Pacific Northwest, with its plentiful hydroelectric power.

The most pleasing idea was to recycle the heat and warm air generated by server rooms and data centers, using the excess heat for temperature control: warming buildings, swimming pools, or perhaps industrial processes. The article concludes with a real-world example: Telehouse, a U.K. colocation provider, specifically designed its London data center with an exhaust pipe to move heat to third parties in the Docklands.

See Exectweets: “Unconventional Computing: The Future is Hot Air” by David Kanter, May 2010.

Published on May 14, 2010 at 3:35 am

Transitory nature of information technology

Are we losing the means to preserve an enduring research trail? The premise is that, because communication between academics and developers now takes so many transitory forms, the steps leading to past scientific discovery and technological innovation will be lost.

Why is this re-creation, even documentation, so important? First, for the history of science; second, for the innovators-to-be coming through the pipeline. Scientists yet to arrive will want to study the development process. Sometimes what appears to be a flash of inspiration is preceded by months, or years, of reading, analysis or experiments. Documentation is important for understanding creative research design. Relatively easy access to successful examples from the past is a necessity.

The pace of innovation is wonderfully fast. That is unequivocally good!


Representations like Alan Warburton’s video, Format: A Brief History of Data Storage, always make me feel a frisson of delight, a shiver of awe. It has great music too: Short Like Me by Beni (Kitsuné Maison).

More than information overload

Data deluge swamps science historians is an eyebrow-raising news story about the collected research materials of William Hamilton, one of the world’s leading evolutionary biologists, following his death. This is more than a problem of information overload. When the British Library received Hamilton’s working papers, it was faced with assembling the contents of

  • 200 crates of handwritten letters, draft typescripts and lab notes,
  • 26 cartons of vintage floppy disks, reels of 9-track magnetic tape, and 80-column punch cards, but no devices that could read these archaic storage media.

It was enough to convince me that we need better digital preservation and archival standards.