Growth of the early Internet, node by node

Network diagrams are a popular way of visualizing social and corporate relationships. Network theory has been used to model telecommunications systems and, in particular, the Internet. A communications network increases in value as its number of connections increases; Metcalfe’s Law attempts to quantify that increase.

Optimizing Metcalfe’s Law

For a network with n members, Metcalfe’s Law posits that the total value of the network is proportional to n * (n-1), the number of possible pairwise connections. Applied to the Internet, or even to the telephone network, the law is only valid if all connections have equal value. That assumption is incorrect: some Internet connections are hardly used and contribute little value. Of course, there are reasons to connect everyone that are not based on monetary value! Rural electrification is an example.

Andrew Odlyzko’s article about Metcalfe’s Law (IEEE Spectrum, 2006) was written with a keen awareness of the 2000 dotcom bubble. Odlyzko demonstrated how Metcalfe’s Law’s applicability could be limited by the equal value assumption, among others. I read it, and wondered: What is the Internet’s optimal number of nodes and connections? When did the value of a larger Internet network start diminishing?
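Odlyzko’s critique can be made concrete with a toy comparison. The sketch below (function names are mine, not from the article) contrasts Metcalfe’s n(n-1) valuation with the n log n growth that Odlyzko and his co-authors proposed as more realistic once unequal connection values are accounted for, and shows how sharply the marginal value of the next node diverges between the two:

```python
import math

def metcalfe_value(n: int) -> int:
    """Metcalfe's Law: value proportional to the number of ordered pairs."""
    return n * (n - 1)

def odlyzko_value(n: int) -> float:
    """Odlyzko's proposed correction: connections have unequal value,
    so total value grows closer to n * log(n)."""
    return n * math.log(n) if n > 1 else 0.0

# Marginal value of adding one more node under each law.
for n in [10, 1_000, 1_000_000]:
    dm = metcalfe_value(n + 1) - metcalfe_value(n)
    do = odlyzko_value(n + 1) - odlyzko_value(n)
    print(f"n={n:>9,}: Metcalfe marginal = {dm:,}, Odlyzko marginal = {do:.2f}")
```

Under Metcalfe’s Law the marginal value of node n+1 is 2n, growing without bound; under the n log n law it grows only logarithmically, which is one way to see why a larger network’s incremental value could start diminishing in relative terms.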

At some point, ISPs (Internet Service Providers) stopped charging users for access, as the business of delivering content became more valuable than that of providing network connectivity. AOL charged for service until 2002 or so.

I thought it would be helpful to begin with a timeline of Internet growth (number of sites connected, with corresponding events) as a starting point for determining incremental value. I searched for a streamlined history; the best I could find, provided by The Computer History Museum, isn’t quite linear and carries a lot of technical detail that isn’t relevant to verifying Metcalfe’s Law. So I decided to construct a timeline of dates and nodes, from which connectivity can be determined. I am writing this partly for myself, for reference purposes. (I don’t yet know how to value connectivity.)

In the beginning

In the beginning, the Internet had only two nodes. It was called the ARPANET. (more…)

Published in: on November 25, 2016 at 6:41 am  Comments (3)  

Just a little bit more Bitcoin trouble

There has been so much tumult in bitcoin and cryptocurrency over the past few days! Interest and concern extend beyond online communities. Motives vary.

Bitcoin miner GUI running on Windows 7

Anonymous and decentralized

There are two conceptual pillars of trust that uphold the claim that bitcoin is superior to fiat currency. (The fiat currency of reference is primarily the US dollar. Why? Because the US dollar is the world’s reserve currency, for now.)

The first is anonymity. US dollars held as cash are bearer instruments. Ownership and use are anonymous, but only until one wants to use them for commercial transactions of significant size as defined by anti-money laundering rules. Bitcoin does have some anonymity shortcomings of its own, as transactions on the blockchain are actually pseudonymous, but there may be tractable remedies. Further details have been widely covered elsewhere.

The second conceptual pillar of bitcoin is decentralization. The US dollar is highly centralized. As ideological (but not market) confidence in the dollar diminishes, the appeal of an apolitical, alternative currency increases, especially one that is a fungible store of value.

All markets are game theoretic, and bitcoin is no exception. I really wish we could ask Professor John Nash what he thinks of bitcoin! A few years ago, Nash wrote a pleasant, accessible article, titled “Ideal Money”, that described a bitcoin-like currency.

I mention game theory because monopolists and cartels can assert control over bitcoin production. This is playing out right now.

Centralization of bitcoin

Currently, Bitcoin’s most acute concern is loss of decentralization, due to the documented, persistent existence of a 51% majority mining pool controlled by GHash.IO. GHash is owned and operated by the private entity CEX.IO. GHash’s market-dominant behavior was first noted in March 2014; at the time, the situation was transient. That has since changed. (more…)
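Why a 51% pool matters can be quantified with the double-spend analysis from the original bitcoin whitepaper (Nakamoto 2008, section 11): an attacker’s chance of rewriting history depends on its share q of total hash power, and once q reaches one half, the majority-pool case above, success is certain. A minimal sketch of that calculation (the function name is mine):

```python
import math

def attacker_success_prob(q: float, z: int) -> float:
    """Probability that an attacker with hash-power share q eventually
    overtakes a chain that is z blocks ahead (Nakamoto 2008, section 11)."""
    if q >= 0.5:
        return 1.0  # a majority pool catches up with certainty
    p = 1.0 - q
    lam = z * (q / p)  # expected attacker progress while z blocks are mined
    s = 1.0
    for k in range(z + 1):
        poisson = math.exp(-lam) * lam ** k / math.factorial(k)
        s -= poisson * (1 - (q / p) ** (z - k))
    return s

# Success odds against the usual six-confirmation rule, for growing pool share.
for q in (0.10, 0.30, 0.45, 0.51):
    print(f"q={q:.2f}: P(rewrite 6 blocks) = {attacker_success_prob(q, 6):.6f}")
```

Below 50% the probability falls off exponentially with confirmations; at or above 50% it is 1.0 regardless of how many confirmations a merchant waits for, which is why a persistent majority pool undermines the decentralization pillar entirely.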

PDF history and something special from Adobe

Part One: PDF history 

PDF is a formal open standard, ISO 32000. It was invented by Adobe Systems 17 years ago.

PDF = Portable Document Format

History of the PDF, by Adobe Systems

The image links to a pleasant interactive timeline of Adobe Systems and its role in the development of the PDF. The chronology is in Flash, and thankfully free of any video or audio. Read more about Adobe Systems’ role in the history of PDF file development.

PDF files are more versatile than I realized, and

  • are viewable and printable on Windows®, Mac OS, and mobile platforms e.g. Android™
  • can be digitally signed
  • preserve source file information — text, drawings, video, 3D, maps, full-color graphics, photos — regardless of the application used to create them

Additional PDF file types exist, including PDF/A (archival) and PDF/E (engineering); PDF files can also carry embedded 3D models in the U3D format. All are supported by Adobe software. (more…)

Published in: on September 5, 2011 at 7:30 pm  Comments (3)  

U.S. Scientists top research fraud list? Concerned? Probably not.

I happened upon this story while reading Politics Daily’s[1] coverage of a Journal of Medical Ethics article about a study of retraction incidence for research papers. The article was published in November 2010. 

The study found that leading causes of invalid research were:

  • retraction due to discovery of lab error after article submission to peer-reviewed journals
  • inability to reproduce results

I see that as honest behavior. Which would be easier: trying to conceal or deny a mistake, or admitting error? The latter couldn’t be easy.

Braver Path Dramatization: Researcher requests article retraction

Dear ACM or IEEE,

I am the author of that research article you featured in last month’s issue. You know, the paper that was covered by most of the scientific press and popular media because my findings had such wide-ranging implications?

Well, I just found a major error in my work as I was re-reading it today. None of the peer reviewers caught it, nor did I, until now. Please issue a retraction in my name. I’ll return that $50,000 of prize money you awarded to me. And I’ll tell the research group at [ pick any of { IBM, the Institute for Advanced Study at Princeton, Google Labs, NIH, CDC, Stanford University, MongoDB, Betaworks, NVIDIA } ] who offered me that great new job based on my research that I was wrong, and that I’ll understand if they rescind their offer of employment and funding….

Actually, I wish the article hadn’t used the word fraud at all, as it was a study of retractions, only a small number of which were due to fraud. There were certainly some cases of outright, predatory fraud, clearly motivated by greed; the article mentions that. But those were a small part of the total number of retracted papers. In fact, when considered in terms of relative rather than absolute counts, the key finding was a U.S. retraction rate of 1.64% over a ten-year interval. That far surpasses the quality standards for rate of failure in nearly every other industry.

The most troubling concerns were plagiarism and deliberate falsification; cases of both were presented in the article. The source data was drawn from the online medical research repository PubMed, covering 2000 to 2009.

The article covered some other trends. Fewer American and Japanese scientists are publishing as a percentage of the total number of publications than in the past. Other countries are now entering the ring. This doesn’t mean that the U.S.A. and Japan are in technological or academic decline! It means that researchers from other nations are gaining better access to education and research funding. That helps everyone.

Also, within the United States, research breakthroughs are becoming far less concentrated in the traditional bastions of Harvard, Stanford and the University of Chicago. Duke, the University of Kansas, the University of Iowa, the University of South Florida and other public and private institutions are coming into their own, achieving prominence like never before.

1. Politics Daily is owned by America Online News.  AOL continues to produce quality content and services, despite the brand’s unfortunate lack of prestige and status.  AOL is much more than an outdated and unpleasant internet service provider, although that is my first thought when I see the triangular AOL logo.

Patent day for Apple

Apple received approval for two patents last week.

Logo Antenna

The Apple logo antenna may seem a bit peculiar. I am uncertain which devices will use this invention. The range of possibilities mentioned in the PatentlyApple article (see below) includes MacBooks, iPhones, wristwatches and pendants. Yes, “pendant” in the traditional meaning of the word: a piece of jewelry worn suspended from a chain around the neck.

Apple Inc, by Phil Bradley

What sort of pendant would need an antenna? I’ve heard mention of an Apple logo watch. It may have been a limited release item, intended for promotional purposes only.

Location sharing by PUSH

The location-sharing update patent at the end of the article is possibly of greater significance. It is PUSH-based. There is an excellent diagram, which I couldn’t capture cleanly without an unwieldy frame.

I wanted to mention an aspect of particular interest, although my thoughts are merely conjecture. Conventionally, location sharing is accomplished by maintaining a continuous background process on the user’s device, even when the device is not in active use. Apple’s PUSH-based service renders that unnecessary. This is obviously beneficial for battery life and power consumption, as touted by the patent.
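The contrast can be sketched in a few lines. This is purely illustrative, not Apple’s implementation; all names here are mine:

```python
from typing import Callable, Tuple

Fix = Tuple[float, float]  # (latitude, longitude), purely illustrative

def poll_loop(get_fix: Callable[[], Fix], report: Callable[[Fix], None],
              ticks: int) -> int:
    """Conventional approach: a background process wakes on every timer
    tick and reports a fix whether or not anyone asked. Returns the
    number of fixes acquired."""
    for _ in range(ticks):
        report(get_fix())  # a radio/GPS wake-up on every tick
    return ticks

def handle_push(get_fix: Callable[[], Fix], respond: Callable[[Fix], None]) -> None:
    """Push approach: the device sleeps until the location server forwards
    a request, then acquires a single fix on demand."""
    respond(get_fix())
```

Over, say, 100 timer ticks with only two actual requests, the polling loop acquires 100 fixes while the push handler acquires 2; that difference is the power saving the patent touts.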

Might another benefit be health-related? I refer to the possibly reduced exposure to RF emissions for the user of a PUSH-based geolocation technology. I wonder whether it could be used with any wireless device, Apple or otherwise, be it smart phone or dumb phone.

via www.patentlyapple.com Noteworthy Patent Published Today:

Apple states that a location-sharing mobile device has to maintain an active background process regardless of whether other devices request such up-to-date information from the location information server. Apple’s patent details their very complicated solution to this problem.  Patent application 20100325194.

Published in: on December 26, 2010 at 8:42 pm  Leave a Comment  

Small Satellites Increase Access to Space

SRI International and NASA gave the final send-off to Radio Aurora Explorer (RAX) and a bevy of other diminutive CubeSat satellites on November 19, 2010, as part of a space weather and atmospheric research project. The launch was accomplished by the Department of Defense’s (DOD) Space Test Program, using a Minotaur IV rocket built around decommissioned Peacekeeper missile stages.

GeneSat-1, a CubeSat-class small satellite

The RAX mission goal is to improve understanding of the intermittent, unpredictable distortion of terrestrial communications signals. Radio frequency and global positioning system (GPS) signals are adversely affected by upper-atmosphere turbulence: a whirlwind of ionizing activity, driven by intense electrical currents that propagate from time to time through near-Earth space. Solar wind storms, thought to be due to sunspot activity or solar flares, are ultimately responsible.

The fact that fluctuating levels of electrical activity in the upper atmosphere disrupt radio signals is well known. The disruption is intensely annoying for amateur radio operators such as myself, as I recall from my days as KA5JQF! It is annoying for GPS users, and can be a critical concern for navigation systems on Earth. With the data collection and experimental results from the RAX research, scientists will gain a better understanding of ionospheric turbulence, and near-space weather forecasting of a sort will finally become a reality.

Easier in near-space than at home

My analogy with terrestrial weather forecasting actually overstates the complexity! Predicting the incidence and duration of radio signal disruptions, due to high solar activity and geomagnetic conditions, will probably be easier than meteorological forecasting of Earth’s very complicated weather systems.

Space weather prediction is challenging primarily because of the difficulty of collecting data, and corroborating cause and effect.

Ecumenical space missions

The RAX mission deserves special attention for another reason. It requires the collective participation of many astrophysicists, geophysicists, astronomers and graduate students from around the world. Radio telescopes and scientific radar installations in Alaska, Norway and Puerto Rico’s Arecibo are hubs for the research project team.

Minotaur rocket launching CubeSats into space. Image courtesy of NASA

I hope we’ll see many more CubeSats in the days to come, and for a variety of purposes. They were developed to increase both research and educational access to space. They are inexpensive, and lower the cost of space research. RAX is the first NSF-funded satellite mission to be launched by the DOD.

Space missions using CubeSats don’t need vast infrastructure development and funding, unlike Apollo. Missions can be initiated by smaller, less wealthy countries and research institutions. CubeSat design timelines are very short, compared to traditional satellite technology. That’s why they’re particularly good for student involvement.

Published in: on December 8, 2010 at 11:30 pm  Comments (2)  

Economic Models for Turbulent Times Part I

Crust is an algorithm for reconstructing surfaces of any topology. In other words, it is a computational method for digitally reconstructing a two-dimensional surface embedded in three-dimensional space, using a set of sample points as input.

Strange topology: the CSAIL buildings at MIT

Such methods garner a lot of attention these days, for a few reasons: graphical simulation models are increasingly needed for visualization and testing in particle physics; World of Warcraft and Second Life rely heavily on computationally intensive graphics and scalable distributed systems; and the U.S. economy is a highly complex system, partly guided by the results of mathematical models.

Crust was developed as a collaborative effort between two staff scientists at Xerox PARC and a researcher at MIT.

None of this happened recently. In fact, Crust hasn’t been semantically linked with the word “new” since its debut at the 1998 ACM SIGGRAPH Conference.

What is so special about Crust?

The Crust algorithm is special because it has certain features that are highly sought after yet uncommon in quantitative models.

First, Crust offers results with provable guarantees. Given a sufficiently dense sample of points from a smooth surface, Crust guarantees that its output is topologically correct, and the reconstruction converges to the original surface as the density of the input data increases.

Graphical computation with Crust: Voronoi Piggy

The third member of the Crust project team was Manolis Kamvysselis, then a Ph.D. student at MIT. Manolis did much of the implementation and testing work, and he wrote a short-form version, A New Voronoi Based Reconstruction Algorithm [PDF], of the original ACM journal publication. Happily, he had the good sense to demonstrate Crust with this fine pink pig! Let’s do the same.

Highly efficient porcine reconstruction in three dimensions

Recall that Crust’s criterion for an acceptable sample is determined locally and dynamically. A single topological surface, such as Piggy, may have very detailed regions with high data density; observe this near Piggy’s ears and snout. Other areas, like the hindquarters, are quite featureless.

Crust adjusts its smallest acceptable sample spacing accordingly. Even minimally detailed surfaces, such as Piggy’s lower hindquarters above the hooves, can be reconstructed accurately.
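The idea is easiest to see in the two-dimensional analogue of Crust, which reconstructs a curve from unorganized sample points: compute the Voronoi vertices of the samples, add them to the point set, triangulate everything, and keep only the Delaunay edges that join two original samples. The following is a sketch of that published two-dimensional algorithm using SciPy; the function name and test curve are mine:

```python
import numpy as np
from scipy.spatial import Delaunay, Voronoi

def crust_2d(samples: np.ndarray) -> set:
    """Two-dimensional crust: Delaunay edges between original samples
    that survive the insertion of the Voronoi vertices.
    `samples` is an (n, 2) array; returns a set of index pairs."""
    vor = Voronoi(samples)
    augmented = np.vstack([samples, vor.vertices])
    tri = Delaunay(augmented)
    n = len(samples)
    edges = set()
    for a, b, c in tri.simplices:
        for u, v in ((a, b), (b, c), (c, a)):
            if u < n and v < n:  # edge joins two original sample points
                edges.add((min(u, v), max(u, v)))
    return edges

# Densely sample a smooth closed curve (a gently wavy circle).
theta = np.linspace(0, 2 * np.pi, 60, endpoint=False)
r = 1.0 + 0.05 * np.cos(3 * theta)
pts = np.column_stack([r * np.cos(theta), r * np.sin(theta)])
# With a dense enough sample, the crust is a single closed loop.
print(len(crust_2d(pts)))
```

Because the Voronoi vertices approximate the medial axis of the shape, any candidate edge that cuts across the interior is eliminated, while edges between neighboring samples on the curve survive. The required sampling density adapts to local feature size, which is exactly the dynamic behavior that lets detailed regions like the ears coexist with featureless ones like the hindquarters.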

To be continued…

Published in: on December 5, 2010 at 7:58 am  Comments (5)  

Supercomputer great success for China

China’s world-leading new supercomputer, Tianhe-1A, is built on a foundation of general-purpose computing and graphics processing units (GPUs).

… on Oct. 28, Tianhe-1A achieved a sustained performance of 2.507 petaflop (2.507 quadrillion calculations every second)… [It is configured] with 14,336 Xeon X5670 processors and 7,168 Nvidia Tesla M2050 general purpose GPUs.

The next fastest computer in the world is Jaguar, a Cray supercomputer at Oak Ridge National Laboratory in Tennessee, in the U.S.A. Its sustained performance is 1.759 petaflops, which makes Tianhe-1A about 1.425 times faster than Jaguar.

Tianhe-1A: China designs and builds the fastest supercomputer in the world

via Supercomputer Tianhe great success for China: says German expert.

Published in: on November 13, 2010 at 11:50 am  Comments (1)  

High-speed rail from China to California

Earlier this morning I was reading a surprisingly, pleasingly blunt BBC article about California’s trade mission. The actual title is Arnold Schwarzenegger sells California to East Asia! While visiting, Governor Schwarzenegger wanted to have a look at the latest high-speed passenger rail technology.

The State of California, with its $19 billion deficit, is investigating public transportation alternatives used in other parts of the world. Japan is interested in contracting to build the trains and loaning California the money to pay for the work. China is too.

How high-speed rail came to China

China has the world’s longest high-speed rail line. However, the expertise to develop and build it was largely contributed by European and Asian countries with advanced technological skills in everything from control systems to laying tracks.

When the Japanese and European companies that pioneered high-speed rail agreed to build trains for China, they thought they’d be getting access to a booming new market, billions of dollars worth of contracts and the cachet of creating the most ambitious rapid rail system in history.  What they didn’t count on was having to compete with Chinese firms who adapted their technology and turned it against them just a few years later.
Train Makers Rail Against China’s High-Speed Designs

There will be some fascinating intellectual property issues should China decide to enter the high-speed rail market as a producer and exporter, given the origins of the technology.

China will also face market-based challenges in the form of competition from countries such as South Korea, which has worked under a contractual arrangement with France’s high-speed TGV program. Both South Korea and Japan would be eager to work with U.S. government or government-funded entities, whether state or federal, in upgrading our nation’s passenger rail service.

California’s fascination with rail transit

California’s history with high-speed rail goes back to 1982, during the days of Governor Jerry Brown. With just a single law, Brown created a California High Speed Rail project and exempted it from California Environmental Quality Act rules. In 1996, the state legislature created a High-Speed Rail Authority. Last year, the California State Auditor expressed some concerns about the state’s High-Speed Rail Authority: “It Risks Delays or an Incomplete System Because of Inadequate Planning, Weak Oversight, and Lax Contract Management”.

Re-patriation initiative

Demanding the transfer of advanced technology from foreign companies, in exchange for access to China’s vast domestic market, has become something of a Chinese national economic strategy. Despite being forward-looking, China is already encountering the challenges that come with a global race to the bottom.

Shanghai authorities have revealed that they are using a database of Chinese students studying abroad in a bid to attract top talent back to the city. The database is populated with information corresponding to Chinese students attending the world’s top 100 universities…
Student database used in Chinese “re-patriation” effort

Science Trading Card, via Flickr

I was inspired by the story of this year’s Nobel Laureate in Physics, Professor Andre Geim. He was an Ig Nobel prize winner a scant ten years ago, for the distinction of levitating live frogs with magnets. Now he and his University of Manchester colleague Konstantin Novoselov have won the Nobel Prize for the isolation of graphene, a carbon-based material with amazing properties, including exceptional conductivity, transparency and strength.

Put aside the subject of this image, cigarettes. Note the style, humor and backdrop. It is delightfully whimsical, much in the spirit I attribute to Professor Geim as he levitated his frogs.
Image: Churchman’s Cigarettes, Early 20th Century originally uploaded by the Chemical Heritage Foundation

Published in: on October 11, 2010 at 10:21 am  Leave a Comment  