Internet standards for HTML

The World Wide Web Consortium (W3C) is standardizing over 100 specifications for the open web, across at least 13 working groups. The CSS Working Group alone is in charge of 50 specifications. That count does not even include work on Unicode, HTTP, and TLS.

Source: http://tantek.com/2011/028/t5/standards-w3c-100-openweb-specs

New tag proposal. Not really.

The nice thing about standards is that there are so many to choose from

I had been waiting to post this until the debate between the W3C and the WHATWG about the scope of HTML5 was resolved. However, I have been waiting since February 2011. The consensus is that “HTML5” is being used, inappropriately, as a catch-all for every standard supported by modern browsers. Modern browsers actually include much more: CSS3 styling, WOFF web fonts, semantic-web features such as microformats, vector graphics (SVG), and performance enhancements. HTML5 tags are merely one part of semantic-web support. As a result, the WHATWG changed its terminology: HTML is the new HTML5.

Published on November 15, 2011 at 4:25 am

PDF history and something special from Adobe

Part One: PDF history 

PDF is a formal open standard, ISO 32000, published in 2008. The format itself was invented by Adobe Systems in 1993.

PDF = Portable Document Format

History of the PDF by Adobe Systems

The image above links to a pleasant interactive timeline of Adobe Systems and its role in the development of the PDF. The chronology is in Flash, and thankfully free of any video or audio. Read more about Adobe Systems’ role in the history of PDF file development.

PDF files are more versatile than I realized, and

  • are viewable and printable on Windows®, Mac OS, and mobile platforms, e.g. Android™
  • can be digitally signed
  • preserve source file information — text, drawings, video, 3D, maps, full-color graphics, photos — regardless of the application used to create them

Additional PDF standards exist, including PDF/A (archiving) and PDF/E (engineering), and 3D content can be embedded via U3D. All are supported by Adobe software.
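
The post itself contains no code, but as a quick illustration of how application-independent PDF files are, here is a minimal Python sketch using the third-party pypdf library (my choice of tool, not something Adobe’s timeline mentions); the file name is hypothetical:

```python
from pypdf import PdfReader  # third-party: pip install pypdf

# Open a PDF and inspect it without the application that created it.
reader = PdfReader("example.pdf")          # hypothetical file name
print("pages:", len(reader.pages))

meta = reader.metadata                     # the Document Information dictionary
if meta:
    print("title:   ", meta.title)
    print("producer:", meta.producer)      # the app that generated the PDF

# The text itself is preserved inside the file, independent of the source app:
print(reader.pages[0].extract_text()[:200])
```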

Published on September 5, 2011 at 7:30 pm

Radiation levels in Japan and the U.S.A.

UPDATE 13 April 2011: All links in Part One work. Added a Part Two for U.S. and European radiation levels.

Part One: Radiation levels in Japan

The source for the chart below is Ryugo Hayano, Ph.D., chair of the Physics Department at The University of Tokyo. Click on the image to view a larger, higher-resolution version; it links directly to the professor’s user page on the image-sharing site Plixi, where you’ll find many other charts and graphs. Some are localized at the prefecture level.

Graph of radiation levels in Japan on 10 April 2011

I offer my thanks to @hayano and Daniel Garcia. Daniel R. Garcia, Ph.D., is a nuclear scientist from France doing a post-doc at TEPCO in Fukushima; he was there prior to the earthquake and tsunami. Daniel frequently sends updates via Twitter as @daniel_garcia_r. He works at the reactor site every day, takes photos, and makes them available there.

Control board of Fukushima 1 Nuclear Power Plant when all was well

Both Daniel and Professor Hayano are reliable: they never confuse becquerels with sieverts with roentgens, and they know radio-isotopes and their half-lives better than nearly anyone. Daniel had to assist the press a few weeks ago when there was confusion among Cesium-137, Iodine-137, Iodine-131, and Uranium-137.
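
To make the unit distinction concrete: a becquerel measures activity (decays per second), while a sievert measures biological dose, so the two are never directly comparable. Here is a minimal Python sketch (my illustration, not from either scientist) that keeps the units as distinct types and makes the isotope-specific conversion coefficient explicit:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Becquerel:
    """Activity: decays per second. Says nothing by itself about biological harm."""
    value: float

@dataclass(frozen=True)
class Sievert:
    """Equivalent dose: joules per kilogram, weighted for biological effect."""
    value: float

def committed_dose(activity: Becquerel, sv_per_bq: float) -> Sievert:
    """Convert ingested activity to committed effective dose.

    The coefficient is isotope- and pathway-specific (ICRP publishes the
    tables), which is exactly why Bq and Sv cannot be compared directly.
    """
    return Sievert(activity.value * sv_per_bq)

# Example: ingesting 1,000 Bq of Cs-137; the ICRP adult ingestion
# coefficient is about 1.3e-8 Sv/Bq.
dose = committed_dose(Becquerel(1000), 1.3e-8)
print(f"{dose.value:.1e} Sv")  # 1.3e-05 Sv, i.e. 13 microsieverts
```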

Part Two: Other locales, other radiation levels

The Radiation Network is an excellent resource for radiation information in the U.S.A. and other parts of the world. It is a network of civilian volunteers using a common protocol to report radiation readings, 24 hours a day, 7 days a week. Sensor stations are located throughout the contiguous 48 states, Hawaii, Alaska, and Norway. There was one in northern Japan; sadly, it went off-line last month.

The Radiation Network is non-profit, all volunteer, and headquartered in Arizona. Tim is the public face of the Radiation Network. Using software developed for this purpose, Tim collects and aggregates the real-time data from the sensor stations, then updates the online map with the readings at one-minute intervals. The Radiation Network went online nearly a decade ago, so it offers very reliable baseline measures for comparison and for detecting any incident. Its criteria for elevated radiation include

  • A rule-out protocol for false positives, e.g. spikes due to a malfunctioning sensor,
  • A definition of significant radiation: the reading must be higher than the threshold AND sustained, with a specific duration for “sustained” (see the sketch after this list),
  • Exogenous causes such as geography: readings in Colorado are always higher due to the higher elevation.
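
The Radiation Network’s actual software is not public, but the criteria above translate naturally into code. A minimal Python sketch, with hypothetical thresholds, that alerts only on sustained elevated readings and ignores one-sample spikes:

```python
from collections import deque

# Hypothetical numbers for illustration; the real thresholds and software
# are not published in this post.
THRESHOLD_CPM = 100        # counts per minute considered "elevated"
SUSTAINED_MINUTES = 10     # how long "sustained" is

def make_monitor(threshold=THRESHOLD_CPM, window=SUSTAINED_MINUTES):
    """Return a function fed one reading per minute; it reports an alert
    only when *every* reading in the window exceeds the threshold,
    so a single-sample spike (a sensor glitch) is ruled out."""
    recent = deque(maxlen=window)

    def observe(cpm: float) -> bool:
        recent.append(cpm)
        return len(recent) == window and all(r > threshold for r in recent)

    return observe

observe = make_monitor()
readings = [40, 42, 300, 41] + [150] * 10   # one spike, then a sustained rise
for minute, cpm in enumerate(readings):
    if observe(cpm):
        print(f"minute {minute}: sustained elevated reading ({cpm} CPM)")
```

The single 300 CPM spike never triggers an alert; only after ten consecutive minutes above threshold does the monitor fire.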

The site is basic but functional, with two main sections: The Maps and The Message. The Message is a running log of updates.

In addition to the embedded links above, you can read a little more about the Radiation Network in a short piece I wrote on Amplify on April 7.

Published on April 10, 2011 at 8:59 am

Transitory nature of information technology

Are we losing the means to preserve an enduring research trail? The premise is that, because academics and developers now communicate across so many transient channels, the steps leading to past scientific discoveries and technological innovations will be lost.

Why is this re-creation, even documentation, so important? First, for the history of science; second, for the innovators-to-be coming through the pipeline. Scientists who have not yet arrived will want to study the development process. Sometimes what appears to be a flash of inspiration was preceded by months, or years, of reading, analysis, or experiments. Documentation is important for understanding creative research design, and relatively easy access to successful examples from the past is a necessity.

The pace of innovation is wonderfully fast. That is unequivocally good!

Representations like Alan Warburton‘s video, Format: A Brief History of Data Storage, always make me feel a frisson of delight, a shiver of awe. It has great music, too: Short Like Me by Beni (Kitsuné Maison).

More than information overload

Data deluge swamps science historians is an eyebrow-raising news story about the collected research materials of William Hamilton, one of the world’s leading evolutionary biologists, after his death. This is more than a problem of information overload. When the British Library received Hamilton’s working papers, they were faced with assembling the contents of

  • 200 crates of handwritten letters, draft typescripts and lab notes,
  • 26 cartons of vintage floppy disks, reels of 9-track magnetic tape, and 80-column punch cards, but no devices that could read these archaic storage media.

It was enough to convince me that we need better digital preservation and archival standards.
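
One concrete archival practice that addresses part of this is fixity checking: record a cryptographic checksum for every file at ingest, then re-verify later to detect silent corruption. A minimal Python sketch (my illustration; the article proposes no specific tooling), with hypothetical paths:

```python
import hashlib
from pathlib import Path

def sha256_of(path: Path, chunk: int = 1 << 20) -> str:
    """Stream the file so large archives need not fit in memory."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        while block := f.read(chunk):
            h.update(block)
    return h.hexdigest()

def write_manifest(root: Path, manifest: Path) -> None:
    """Record one checksum per file; re-running later reveals any bit rot."""
    with manifest.open("w") as out:
        for p in sorted(root.rglob("*")):
            if p.is_file():
                out.write(f"{sha256_of(p)}  {p.relative_to(root)}\n")

write_manifest(Path("archive"), Path("archive.sha256"))  # hypothetical paths
```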
