Cutting corners on telecom infrastructure with Huawei

In January 2013, I wrote a Quora blog post about Huawei’s twisty, winding path to prominence. There were plenty of oddities, e.g. Huawei was a supplier to the Taliban and, later, was nearly acquired by GOP presidential candidate Mitt Romney… but not at the same time!

Huawei is back in the limelight. Curiously, the problem is not one of Chinese state interference but of sloppy software development. I’ll get to that, but first, let’s take an illustrated tour of the Huawei story.

A casual Huawei timeline

2001 – Huawei India faces allegations that it had developed telecommunications equipment used by the Taliban in Afghanistan.

Huawei greeters at ITU World Telecom 2007, but probably not for the Taliban

2010 – Reuters reports that a partner of Huawei tried to sell embargoed Hewlett-Packard computer equipment to Iran’s largest mobile-phone operator.

Huawei at mobile device trade show convention in Iran

2011 – The Australian government excludes Huawei from tendering for contracts with a government-owned corporation constructing a broadband network.

2012 – The Canadian government excludes Huawei from plans to build a secure government communications network.

Huawei phone Pegasus, Barcelona 2012

2013 – The U.S.-China Economic & Security Review Commission advised Congress about Chinese government influence on Huawei.

2013 – Reuters investigative report following receipt of a letter from a concerned Los Alamos National Labs (LANL) employee:

[LANL] had installed devices made by H3C Technologies Co [which] raises questions about procurement practices by U.S. departments responsible for national security.

The devices were Chinese-made switches used for managing data traffic on LANL computer networks. Huawei’s relationship with the Chinese military was mentioned.

(more…)

Published in: on 23 October 2019 at 5:48 am  Comments (2)  

Yet another academic plagiarism scandal: blockchains, medical research, and patents

One must be intrinsically motivated to be ethical and honest. Integrity cannot be imposed by peer review.

This is not another SFYL (sorry for your loss) tale of cryptocurrency scamming. That is merely a grace note. Academic plagiarism can happen, regardless of whether bitcoin, blockchains, or cryptocurrency are involved.

One’s own professional community, and the moral implications of having lied and plagiarized, i.e. shame, should be enough to keep scientific and other original researchers (and investigative work in general) honest. It clearly isn’t. I make that observation based on this passage via Andrew Gelman (emphasis mine): (more…)

Published in: on 16 November 2018 at 6:28 am  Comments (7)  

Growth of the early Internet, node by node

Network diagrams are a popular way of visualizing social and corporate relationships. Network theory has been used to model telecommunications performance, and especially the Internet. Communications networks increase in value as the number of connections increases. Metcalfe’s Law attempts to quantify the increased value.

Optimizing Metcalfe’s Law

For a network with n members, Metcalfe’s Law posits that the total value of that network is proportional to n * (n-1). Metcalfe’s Law as applied to the Internet, and even to the telephone network, is only valid if all connections have equal value. This is incorrect. Some internet connections are hardly used and contribute limited value. Of course, there are reasons to connect everyone that are not based on monetary value! Rural electrification is an example.

Andrew Odlyzko’s article about Metcalfe’s Law (IEEE Spectrum, 2006) was written with a keen awareness of the 2000 dotcom bubble. Odlyzko demonstrated how Metcalfe’s Law’s applicability could be limited by the equal value assumption, among others. I read it, and wondered: What is the Internet’s optimal number of nodes and connections? When did the value of a larger Internet network start diminishing?
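Odlyzko’s critique can be made concrete in a few lines of code. The sketch below is my own toy illustration, not anything from the article: the function names are mine, and it simply compares Metcalfe’s n·(n−1) valuation with the n·log(n) alternative proposed in that Spectrum article. The gap between the two curves is exactly the “equal value” assumption.

```python
import math

def metcalfe_value(n):
    """Metcalfe's Law: value proportional to the number of possible pairwise links."""
    return n * (n - 1)

def odlyzko_value(n):
    """Odlyzko's proposed correction, n * log(n), which assumes most
    connections are worth far less than the few heavily used ones."""
    return n * math.log(n) if n > 1 else 0.0

# Compare how the two valuations diverge as the network grows.
for n in (4, 100, 10_000):
    print(n, metcalfe_value(n), round(odlyzko_value(n)))
```

At n = 10,000 the Metcalfe figure is over a thousand times the n·log(n) figure, which is why the choice of valuation formula mattered so much during the dotcom bubble.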

At some point, ISPs (Internet Service Providers) stopped charging users for access, as the business of delivering content became more valuable than providing greater network connectivity. AOL charged for service until 2002 or so.

I thought it would be helpful to begin with a timeline of Internet growth, by number of sites connected and corresponding events, as a starting point for determining incremental value. I searched for a streamlined history, but the best that I could find is provided by The Computer History Museum, and it isn’t quite linear. It also has a lot of technical detail that isn’t relevant for verifying Metcalfe’s Law. I decided to construct a timeline of dates and nodes, from which connectivity can be determined. I am writing this partly for myself, for reference purposes. (I don’t know how to value connectivity, not yet.)

In the beginning

In the beginning, the Internet had only two nodes. It was called the ARPANET. (more…)

Published in: on 25 November 2016 at 6:41 am  Comments (10)  

Just a little bit more Bitcoin trouble

There has been so much tumult in bitcoin and cryptocurrency over the past few days! Interest and concern extends beyond online communities. Motives vary.


Bitcoin miner GUI running Windows 7

Anonymous and decentralized

There are two conceptual pillars of trust that uphold bitcoin as being superior to fiat currency. (The fiat currency of reference is primarily the US dollar. Why? Because the US dollar is the world’s reserve currency, for now.)

The first is anonymity. US dollars held as cash are bearer instruments. Ownership and use are anonymous, but only until one wants to use them for commercial transactions of significant size, as defined by anti-money-laundering rules. Bitcoin does have some anonymity shortcomings, as transactions on the blockchain are actually pseudonymous, but there may be tractable remedies. Further details have been widely covered elsewhere.

The second conceptual pillar of bitcoin is decentralization. The US dollar is highly centralized. As ideological (but not market) confidence in the dollar diminishes, the appeal of an apolitical, alternative currency increases, especially one that is a fungible store of value.

All markets are game theoretic, and bitcoin is no exception. I really wish we could ask Professor John Nash what he thinks of bitcoin! Nash wrote a pleasant, accessible article describing a bitcoin-like currency, titled “Ideal Money”, a few years ago.

I mention game theory because monopolists and cartels can assert control over bitcoin production. This is playing out right now.
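The danger of a majority miner can be quantified. The bitcoin whitepaper models the race between an attacker and the honest chain as a random walk: an attacker holding fraction q of the hash power, starting z blocks behind, catches up with probability (q/p)^z where p = 1 − q, and with certainty once q reaches one half. A minimal sketch of that formula (my own illustration; the function name is mine):

```python
def catch_up_probability(q, z):
    """Probability that an attacker with hash-power share q eventually
    overtakes an honest chain that is z blocks ahead (Nakamoto's model)."""
    p = 1.0 - q
    if q >= p:          # at 50% or more, the attacker always catches up
        return 1.0
    return (q / p) ** z

# Six confirmations protect well against a small miner...
print(catch_up_probability(0.10, 6))   # vanishingly small
# ...but not at all against a 51% pool.
print(catch_up_probability(0.51, 6))   # exactly 1.0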

Centralization of bitcoin

Currently, Bitcoin’s most acute concern is loss of decentralization, due to the documented, persistent existence of a 51% majority mining pool controlled by gHash.io. gHash is owned and operated by the private entity cex.io. gHash’s market-dominant behavior was first noted in March 2014; at the time, the situation appeared transient. That has since changed. (more…)

PDF history and something special from Adobe

Part One: PDF history 

PDF is a formal open standard, ISO 32000. It was invented by Adobe Systems in 1993.

PDF = Portable Document Format


History of the PDF by Adobe Systems

The image links to a pleasant interactive timeline of Adobe Systems and its role in the development of the PDF. The chronology is in Flash, and thankfully free of any video or audio. Read more about Adobe Systems’ role in the history of PDF file development.

PDF files are more versatile than I realized, and

  • are viewable and printable on Windows®, Mac OS, and mobile platforms e.g. Android™
  • can be digitally signed
  • preserve source file information — text, drawings, video, 3D, maps, full-color graphics, photos — regardless of the application used to create them

Additional PDF file types exist, including PDF/A and PDF/E, along with embedded 3D content in the U3D format. All are supported by Adobe software. (more…)

Published in: on 5 September 2011 at 7:30 pm  Comments (3)  

U.S. Scientists top research fraud list? Concerned? Probably not.

I happened upon this story while reading Politics Daily’s[1] coverage of a Journal of Medical Ethics article about a study of retraction incidence for research papers. The article was published in November 2010. 

The study found that the leading causes of retraction were:

  • discovery of a lab error after article submission to a peer-reviewed journal
  • inability to reproduce results

I see that as honest behavior. Which would be easier, trying to conceal or deny a mistake, or admitting error? The latter couldn’t be easy.

Braver Path Dramatization: Researcher requests article retraction

Dear ACM or IEEE,

I am the author of that research article you featured in last month’s issue. You know, the paper that was covered by most of the scientific press and popular media because my findings had such wide-ranging implications?

Well, I just found a major error in my work as I was re-reading it today. None of the peer reviewers caught it, nor did I, until now. Please issue a retraction in my name. I’ll return that $50,000 of prize money you awarded to me. And I’ll tell the research group at [ pick any of { IBM, the Princeton Institute for Advanced Study, Google Labs, NIH, CDC, Stanford University, mongoDB, Betaworks, NVIDIA} ] who offered me that great new job based on my research that I was wrong, and I’ll understand if they rescind their offer of employment and funding….

Actually, I wish the article hadn’t used the word fraud at all, as it is a study of retractions, only a small number of which were due to fraud. There were certainly some cases of outright, predatory fraud, clearly motivated by greed; the article mentions that. But those were a small part of the total number of retracted papers. In fact, when considered in terms of relative rather than absolute counts, the key finding was that the U.S. retraction rate was 1.64% over a ten-year interval. This far surpasses quality standards for rate of failure in nearly every other industry.

The most troubling concerns were plagiarism and deliberate falsification; cases of both were presented in the article. Source data was drawn from the online medical research repository PubMed, covering 2000–2009.

The article covered some other trends. American and Japanese scientists now account for a smaller percentage of total publications than in the past, as other countries enter the ring. This doesn’t mean that the U.S.A. and Japan are in technological or academic decline! It means that researchers from other nations are gaining better access to education and research funding. That helps everyone.

Also, within the United States, research breakthroughs are becoming far less concentrated in the traditional bastions of Harvard, Stanford and University of Chicago. Duke, University of Kansas, University of Iowa, University of South Florida and other public and private institutions are coming into their own, achieving prominence like never before.

1. Politics Daily is owned by America Online News.  AOL continues to produce quality content and services, despite the brand’s unfortunate lack of prestige and status.  AOL is much more than an outdated and unpleasant internet service provider, although that is my first thought when I see the triangular AOL logo.

Patent day for Apple

Apple received approval for two patents last week.

Logo Antenna

The Apple logo antenna may seem a bit peculiar. I am uncertain which devices will use this invention. The range of possibilities mentioned in the PatentlyApple article (see below) includes MacBooks, iPhones, wristwatches and pendants. Yes, “pendant” in the traditional meaning of the word, as in a type of jewelry suspended from a chain worn around the neck.


Apple Inc by Phil Bradley

What sort of pendant would need an antenna? I’ve heard mention of an Apple logo watch. It may have been a limited release item, intended for promotional purposes only.

Location sharing by PUSH

The location-sharing update patent at the end of the article is possibly of greater significance. It is PUSH-based. There is an excellent diagram, which I couldn’t easily capture without an awkward frame.

I wanted to mention an aspect of particular interest, although my thoughts are merely conjecture. Recall that geolocation is conventionally accomplished by maintaining a continuous background process on the user’s device, even when it is not in active use. Apple’s PUSH-based service renders this unnecessary. That’s obviously beneficial for battery life and power consumption, as touted by the patent.
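The power argument is easy to see with a back-of-the-envelope comparison. The numbers below are hypothetical (a 30-second polling interval and four location requests per hour are my own made-up figures, not from the patent), but they show why demand-driven wakeups beat continuous polling:

```python
# Hypothetical figures for one hour of operation.
POLL_INTERVAL_S = 30      # assumed background-process wake interval (pull model)
REQUESTS_PER_HOUR = 4     # assumed number of friends asking for our location

pull_wakeups = 3600 // POLL_INTERVAL_S  # device wakes whether or not anyone asks
push_wakeups = REQUESTS_PER_HOUR        # device wakes only on actual demand

print(pull_wakeups, push_wakeups)  # 120 wakeups vs 4
```

Under these assumptions the pull model does thirty times the work, and the gap widens as the polling interval shrinks.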

Might another benefit be health-related? I refer to the possibly reduced exposure to RF emissions for the user of a PUSH-based geolocation technology. I wonder if it can be used for any wireless device, whether Apple or otherwise, be it smart phone or stupid phone?

via www.patentlyapple.com Noteworthy Patent Published Today:

Apple states that a location-sharing mobile device has to maintain an active background process regardless of whether other devices request such up-to-date information from the location information server. Apple’s patent details their very complicated solution to this problem.  Patent application 20100325194.

Published in: on 26 December 2010 at 8:42 pm  Leave a Comment  

Small Satellites Increase Access to Space

SRI International and NASA gave the final send-off to Radio Aurora Explorer (RAX) and its bevy of diminutive CubeSat satellites on November 19, 2010, as part of a space weather and atmospheric research project. The launch was accomplished with the assistance of an elderly Minotaur IV rocket through the Department of Defense’s (DOD) Space Test Program.

This is a CubeSat: Genesat-1

The RAX mission goal is to improve understanding of the intermittent and unpredictable distortion of earthly communications signals. Radio frequency and global positioning system (GPS) signals are adversely affected by upper-atmosphere turbulence. This turbulence is like a whirlwind of ionizing activity, due to intense electrical currents that propagate from time to time through space. Solar wind storms, presumably driven by sunspot activity or coronal mass ejections, are ultimately responsible.

The fact that fluctuating levels of electrical activity in the upper atmosphere cause radio signal disruption is well-known. The disruption is intensely annoying for amateur radio operators such as myself, as I recall from my days as KA5JQF! It is annoying for GPS users, and can be a critical concern for navigation systems on Earth. With the data collection and experimental results from the RAX research, scientists will gain a better understanding of ionospheric turbulence. Near-space weather forecasting of sorts will finally become a reality.

Easier in near-space than at home

My analogy with terrestrial weather forecasting actually overstates the complexity! Predicting incidence and duration of radio signal disruptions, due to high solar activity and geomagnetic conditions, will probably be easier than meteorological forecasting of Earth’s very complicated weather systems.

Space weather prediction is challenging primarily because of the difficulty of collecting data, and corroborating cause and effect.

Ecumenical space missions

The RAX mission deserves special attention for another reason. It requires the collective participation of many astrophysicists, geophysicists, astronomers and graduate students from around the world. Radio telescopes and scientific radar installations in Alaska, Norway and Puerto Rico’s Arecibo are hubs for the research project team.


Minotaur rocket launching cube satellites into space – Image courtesy of NASA

I hope we’ll see many more CubeSats in the days to come, and for a variety of purposes. They were developed to increase both research and educational access to space. They are inexpensive, and they lower the cost of space research. RAX is the first NSF satellite mission to be launched by the DOD.

Space missions using CubeSats don’t need vast infrastructure development and funding, unlike Apollo. Missions can be initiated by smaller, less wealthy countries and research institutions. CubeSat design timelines are very short, compared to traditional satellite technology. That’s why they’re particularly good for student involvement.

Published in: on 8 December 2010 at 11:30 pm  Comments (2)  

Economic Models for Turbulent Times Part I

Crust is an algorithm for reconstructing surfaces of any topology. In other words, it is a computational method for digitally reconstructing a two-dimensional surface embedded in three-dimensional space, using a set of sample points as input.

Strange topology at MIT: the CSAIL buildings

Such methods garner a lot of attention these days. Here are a few reasons why: graphical simulation models are increasingly needed for visualization and testing in particle physics; World of Warcraft and Second Life rely heavily on computationally intensive graphics and scalable distributed systems; and the U.S. economy is a highly complex system, partly guided by the results of mathematical models.

Crust was developed as a collaborative effort between two staff scientists at Xerox PARC and a researcher at MIT.

None of this happened recently. In fact, Crust hasn’t been semantically linked with the word “new” since its debut at the 1998 ACM SIGGRAPH Conference.

What is so special about Crust?

The Crust algorithm is special because it has certain features uncommon in most quantitative models, yet highly sought after.

First, Crust offers results with “provable” guarantees. Given a sufficiently dense sample from a smooth surface, Crust guarantees that its output is topologically correct, and that the output converges to the original surface as the input sampling density increases.


Graphical computation with Crust: Voronoi Piggy

The third member of the Crust project team was Manolis Kamvysselis, a Ph.D. student at MIT. Manolis did much of the implementation and testing work; he wrote a short-form version, A New Voronoi-Based Surface Reconstruction Algorithm [PDF], of the original ACM journal publication. Happily, he had the good sense to demonstrate Crust with this fine pink pig! Let’s do the same.

Highly efficient porcine reconstruction in three dimensions

Recall that Crust’s criterion for acceptable sample size is determined dynamically. A single topological surface, such as Piggy, may have very detailed areas with high data density. Observe this near Piggy’s ears and snout. Other areas, like the hindquarters, are quite featureless.

Crust dynamically adjusts its smallest acceptable sample size accordingly. Even minimally detailed surfaces, such as Piggy’s lower hindquarters above the hooves, can be reconstructed accurately.
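The idea of curvature-dependent sampling density is easy to demonstrate in miniature. The sketch below is not Crust itself (the real algorithm works on Voronoi diagrams of 3-D point clouds); it just illustrates the sampling criterion in 2-D by placing points along an ellipse with spacing inversely proportional to local curvature, so the pointy end (think snout) gets dense samples and the flat side (the hindquarters) gets sparse ones. The ellipse, the constant c, and every number here are my own choices for illustration.

```python
import math

def adaptive_sample_ellipse(a=2.0, b=1.0, c=0.05):
    """Place points along an ellipse with arc spacing inversely
    proportional to local curvature: dense where the shape is pointy,
    sparse where it is flat."""
    ts, pts = [], []
    t = 0.0
    while t < 2.0 * math.pi:
        sin_t, cos_t = math.sin(t), math.cos(t)
        speed = math.sqrt(a * a * sin_t * sin_t + b * b * cos_t * cos_t)
        curvature = a * b / speed ** 3       # curvature of the ellipse at t
        ts.append(t)
        pts.append((a * cos_t, b * sin_t))
        t += (c / curvature) / speed         # arc step c/curvature -> parameter step
    return ts, pts

ts, pts = adaptive_sample_ellipse()

def mean_gap(lo, hi):
    """Average distance between consecutive samples with parameter in [lo, hi]."""
    gaps = [math.dist(pts[i], pts[i + 1])
            for i in range(len(ts) - 1) if lo <= ts[i] <= hi]
    return sum(gaps) / len(gaps)

# Samples are much denser at the pointy end (t near 0) than on the flat side.
print(round(mean_gap(0.0, 0.3), 3), round(mean_gap(1.3, 1.8), 3))
```

The spacing near the high-curvature end comes out several times tighter than on the flat side, which is the same trade-off Crust makes when deciding how dense a sample it needs.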

To be continued…

Supercomputer great success for China

China’s world-leading new supercomputer, Tianhe-1A, is built on a foundation of general-purpose computing and graphics processing units (GPUs).

… on Oct. 28, Tianhe-1A achieved a sustained performance of 2.507 petaflop (2.507 quadrillion calculations every second)… [It is configured] with 14,336 Xeon X5670 processors and 7,168 Nvidia Tesla M2050 general purpose GPUs.

The next fastest computer in the world is Jaguar, a Cray supercomputer at Oak Ridge National Laboratory in Tennessee, U.S.A. Its peak performance is 1.75 petaflops, making Tianhe-1A roughly 1.43 times faster than Jaguar.
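A quick sanity check of the arithmetic, using the figures as quoted above:

```python
tianhe_pflops = 2.507   # sustained performance, per the quoted report
jaguar_pflops = 1.75    # Jaguar's figure as cited above

speedup = tianhe_pflops / jaguar_pflops
print(round(speedup, 2))
```

With the 1.75-petaflops figure the ratio comes out near 1.43; the widely reported 1.425 figure corresponds to Jaguar’s measured 1.759 petaflops on the Top500 list.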


Tianhe is the Fastest Supercomputer in the World

via Supercomputer Tianhe great success for China: says German expert.

Published in: on 13 November 2010 at 11:50 am  Comments (1)  