Saturday, May 7, 2011

technology news


Expert: Skype for Mac hole can be used in remote attack




A security researcher said today that he found a serious hole in the Mac version of Skype that could be used by an attacker to remotely take control of someone else's computer.
In response, Skype says it released a "hotfix"--a quick fix to tide users over until a full update is ready--for the issue in a minor update in mid-April. It did not prompt users to update their software because there were no reports of the hole being exploited in the wild, and it was planning to issue another update early next week.
Gordon Maddern, of Pure Hacking in Australia, says he discovered the vulnerability about a month ago. He was chatting on Skype with a colleague about a payload when the payload accidentally executed in the colleague's Skype client, Maddern writes in a blog post today.
He created a proof of concept that could be used in an attack but is not releasing details until Skype fixes the issue. He could not find the vulnerability in the Skype clients for Windows and Linux, he said.
Maddern said he contacted Luxembourg-based Skype and received a note saying "Thank you for showing an interest in Skype security, we are aware of this issue and will be addressing it in the next hotfix."
"That was over a month ago and there still has not been a fix released," he wrote in his blog post. "The long and the short of it is that an attacker needs only to send a victim a message and they can gain remote control of the victim's Mac. It is extremely wormable and dangerous."
In a blog post, Adrian Asher of Skype explains that the vulnerability "is related to a situation when a malicious contact would send a specifically crafted message that could cause Skype for Mac to crash. Note, this message would have to come from someone already in your Skype Contact List, as Skype's default privacy settings will not let you receive messages from people that you have not already authorized, hence the term malicious contact."
"At the time they (Pure Hacking) alerted us, we were already aware of the issue and were working on a fix to protect Skype users from this vulnerability, as we take our users' security very seriously," Asher wrote.
Updated 4:13 p.m. PT with Skype saying it previously issued a hotfix and will release an update that addresses the vulnerability next week.





Elinor Mills covers Internet security and privacy. She joined CNET News in 2005 after working as a foreign correspondent for Reuters in Portugal and writing for The Industry Standard, the IDG News Service, and the Associated Press.

iPad 2 frenzy in China

by John Paczkowski, AllThingsD
The iPad 2 debuted in China this morning to what is fast becoming a standard reception: massive lines and quick stock-outs.
That the device had been unofficially available on the market--through sellers who brought it into the country after buying the device overseas--did little to quell demand, which drove hundreds of hopeful buyers to queue overnight outside Apple's four stores in Beijing and Shanghai. "When we arrived here at around 4 a.m., there were already more than 500 people waiting," an Apple security guard at the company's downtown Beijing store told Xinhua. "The crowd rose to some 1,000 people when the store opened."
Sales began promptly at 8 a.m., the first retail stock-out was reported about four hours later, and by Friday afternoon the iPad shipping estimates at Apple's Chinese Online Store had gone from "1-2 weeks" to "No Supply."
So a very strong first-day showing for the iPad 2 in China, and one that suggests Apple's decision to make China a top priority is paying off in a very big way. According to Analysys International, Apple was able to claim a 78 percent share of China's tablet market with the first iPad. How much more will it claim now, given the staggering response to the launch of the second?
(Credit: M.I.C. Gadget)
Story Copyright (c) 2011 AllThingsD. All rights reserved.


Additional stories from AllThingsD

  1. Key Developer Joe Hewitt Leaves Facebook
  2. Exclusive: Sony Considers Offering Reward To Help Catch Hackers
  3. Zynga Document Discloses Major Round of Financing in the Works
  4. iPad 2 Frenzy in China

Analysts' takes: Apple going ARM on MacBooks?


Future MacBooks running the same ARM chips that populate the iPad and iPhone?
(Credit: Apple)
The rumor that Apple will drop Intel chips and move future MacBooks to the same kind of silicon that powers Apple's iPhone and iPad has analysts pondering the prospect. Here are a few reactions.
As a preface to the comments below, one analyst cited Microsoft's announcement that Windows 8 will not run exclusively on Intel chips but also on ARM--the same chip architecture that powers Apple's iPhone and iPad. So, in a way, Microsoft is already on record with a transition to ARM.
Smart move for Apple vis-a-vis its developers: "This would be, in part, an ecosystem building opportunity. It would be saying to developers that Apple has the opportunity to increase the size of the TAM (Total Available Market) for developers to write for, while also changing the face of computing by bringing key characteristics such as instant-on and long battery life to the notebook clamshell form factor." --Richard Shim, analyst, DisplaySearch.
Apple has switched architectures before but...: "Apple has switched architectures in the past, so it is certainly possible they could switch to ARM. I don't see why they would do it, though. Even with a 64-bit architecture, ARM processors will not offer performance competitive with the high end of Intel's line, so Apple might be sacrificing all of its professional users. ARM may offer some battery life and cost benefits for mainstream laptops, but given that Intel is focusing on these parameters, I don't think the benefits would be sizable. Also, as indicated by its recent 22 [nanometer] announcement, Intel has a manufacturing technology advantage that will prevent ARM from getting very far ahead, if at all. So I am skeptical." --Linley Gwennap, principal analyst, The Linley Group.
It's just a matter of time: "Apple likes vertical integration, has proven ability to migrate software among instruction sets, and can derive adequate performance from non-Intel CPUs. Thus, I think it's only a matter of time before we see Apple computers with keyboards using ARM CPUs. I agree...that it makes sense to wait for the 64-bit ARM instruction set to break cover. My guess is that they'll use a homegrown CPU out of the chute. They've had CPU-development capability long enough in house to have something ready in 2012." --Joseph Byrne, The Linley Group.
Performance, performance, performance: "The concern is performance. Who knows for sure by 2013 what ARM will have? But Intel's 22-nanometer chips will be widely available by then. That will make it tough for other people to compete on a raw performance basis. You can offset by saying we're at the point where there's good-enough computing [so] we don't need that performance. But that's a hard argument to accept because we've said that for years. And yet people keep wanting to buy faster and faster PCs. Oh, and by the way, new software soaks up any extra CPU cycles. That said, over the years [Apple has] done two huge instruction set transitions and they've done them very successfully. So, it's not out of the realm of possibility--in order to give [Apple] a single instruction set in a combined platform. And they could do it in phases, where the MacBook Air stuff goes to iOS and ARM and they keep the higher-end stuff on Intel." --Bob O'Donnell, analyst, IDC.
The risk factor: "Has Apple beefed up its chip team? I don't think they have. Besides, silicon is not their forte. I think it would be a strategic mistake. Intel can offer them extremely competitive products, leading-edge process technology, and throwaway prices. So, what's the advantage? There's going to be more risk than upside. If they misexecute on a product line, then the entire product strategy is at risk. And the price-premium argument completely goes away." --Ashok Kumar, analyst, Rodman & Renshaw.

Ringbow: A new way to click a touch screen



The Ringbow is a wireless accessory controller designed for touch-screen devices.
(Credit: Rafe Needleman/CNET)
At a California Israel Chamber of Commerce demo event yesterday, I got a walk-through of an unusual and, as-pitched, probably hopeless idea for improving the interface of touch-screen devices: The Ringbow, a ring-mounted, wireless pointing stick.
The Ringbow does solve a problem in an elegant way. Touch-screen apps generally have only limited ways to control them, so access to menu commands or secondary functions requires trips to full menus, which slows down the user. The Ringbow is a finger-mounted five-way controller (four compass directions plus pushing down) that makes blasting through accessory menus faster than it would be in most apps.

Also at CICC: Fellowup, the Grandma-approved contact manager
In a demo (see video; note that the wire is for an extra battery pack the prototype device requires), selecting drawing submenu options (color picker, line weight chooser, pen type), and then making selections in those submenus, was much faster than it would otherwise be. Ringbow CEO Efrat Barit proposes that software vendors who make complex graphical apps (such as Adobe) could make their products easier and faster to use for professionals by adding Ringbow shortcuts.
There are also benefits in games, where a ring-mounted controller adds a lot of control options that one otherwise doesn't have in a touch-screen device.

Adding functionality to touch screens

However, I am skeptical that developers will pick up on this new concept in great numbers, despite Barit's statement that mobile apps developers are "excited" about the technology. There's just this huge chicken-and-egg problem here: You don't want to develop for an accessory nobody has, and nobody's going to buy a Ringbow without software that uses it.
I did use a Ringbow and can confirm that it does indeed make a touch-screen application's user interface faster (at least it did in the demo I tried), but clever developers could add new modalities to their apps without requiring new hardware. Multitouch concepts can be used for direct access to menu options; already users are familiar with "pinch" and "rotate" gestures, and OS X users are accustomed to a two-finger tap on a trackpad as the equivalent of a right-click on a mouse. Other multitouch gestures could be added to the touch lexicon for other functions. An expanded multitouch interface might not be as fast as a Ringbow, but at least developers won't have to worry about users who don't have the device.
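The software-only alternative sketched above amounts to a dispatch table: each recognized gesture maps to a menu action that would otherwise take a trip through full menus. A minimal illustration follows; the gesture and action names are hypothetical, not drawn from any real app or Ringbow software.

```python
# Minimal sketch of a gesture-to-action dispatch table -- the software-only
# alternative to accessory hardware described above. All gesture and action
# names here are hypothetical illustrations.

def open_color_picker():
    return "color picker opened"

def rotate_canvas():
    return "canvas rotated"

GESTURE_ACTIONS = {
    "two_finger_tap": open_color_picker,  # plays the role of a right-click
    "rotate": rotate_canvas,
}

def handle_gesture(name):
    """Run the action bound to a recognized gesture; ignore unknown gestures."""
    action = GESTURE_ACTIONS.get(name)
    return action() if action else None

print(handle_gesture("two_finger_tap"))  # color picker opened
```

Extending the "touch lexicon" in this scheme is just adding an entry to the table, which is why no new hardware is required.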
Rather than the Ringbow getting traction in consumer apps and games on touch-screen devices, I see it being used in other specialized environments. It'd be great as a secondary controller in military and service vehicles, and arguably very useful for people who otherwise have their hands full but need access to technology--in medicine, perhaps. I would not bet against this technology being taken up in military and industrial applications, but it's too early in the history of touch-screen devices to say they need this kind of hardware to make them more usable.


Mozilla fights DHS over anti-MPAA, RIAA utility

No judge has ever declared a Firefox plug-in called MafiaaFire Redirector to be illegal. But that didn't stop the U.S. Department of Homeland Security from trying to censor it from the Web.
The Mozilla Foundation says DHS requested the removal of MafiaaFire, which describes itself as a utility that "automatically redirects you to the correct alternate site" if the main domain has been seized by the U.S. government.
Harvey Anderson, Mozilla's general counsel, told CNET today that the request from DHS was made over the phone. Anderson replied in writing, posing a list of questions in an April 19 e-mail, including this important one: "Is Mozilla legally obligated to disable the add-on?"
Anderson says DHS hasn't replied, and the plug-in has not been removed.
A DHS spokesman told CNET this afternoon that "ICE's Homeland Security Investigations does not comment publicly on our interaction with Internet intermediaries on intellectual property theft enforcement issues." ICE stands for the Immigration and Customs Enforcement division.
The reason DHS doesn't like the MafiaaFire plug-in is obvious: It makes the government's tactic of seizing domain names less useful. FirstRow.net, Atdhe.net, and Torrent-Finder.com are among the domains seized on grounds that they're allegedly infringing the copyrights of U.S. companies.
One response to a domain name seizure is, simply, to move to a new one, preferably in a top-level domain that can't be easily reached by DHS and the U.S. judicial system. That's what the popular sports video-streaming Web site, Atdhe.net, did after its domain went offline. It's now at Atdhenet.tv (and, just in case, Atdhe.me as well).
MafiaaFire helps to make this process a little easier by redirecting Firefox automatically to the replacement Web site. Its unflattering name arose out of a protest against the RIAA and MPAA--aka "the Music and Film Industry Association of America"--and the "mad-with-power ICE."
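The plug-in's core idea, as described, is a lookup table mapping a seized domain to its announced replacement, applied before a request goes out. Here is a hedged sketch of that idea, not MafiaaFire's actual code; the one mapping shown is the Atdhe.net move reported above.

```python
# Hypothetical sketch of the MafiaaFire redirect idea: rewrite a URL's host
# if it appears in a seized-domain -> replacement-domain table.
# (Illustration only -- not the plug-in's actual code or mapping list.)
from urllib.parse import urlparse, urlunparse

# Example mapping, using the replacement mentioned in the article.
REDIRECT_MAP = {
    "atdhe.net": "atdhenet.tv",
}

def redirect_if_seized(url):
    """Return the URL rewritten to its replacement host, or unchanged."""
    parts = urlparse(url)
    replacement = REDIRECT_MAP.get(parts.hostname or "")
    if replacement is None:
        return url
    return urlunparse(parts._replace(netloc=replacement))

print(redirect_if_seized("http://atdhe.net/watch"))   # http://atdhenet.tv/watch
print(redirect_if_seized("http://example.com/page"))  # unchanged
```

The real add-on fetches its mapping list from the MafiaaFire maintainers, which is what lets sites update their replacement domains without users doing anything.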
If a government official applies pressure on a private company to delete a file or document, that can raise constitutional and free speech issues. In the 1963 case known as Bantam Books v. Sullivan, the U.S. Supreme Court ruled that a commission's extra-judicial notification that some books or magazines were objectionable was an illegal "system of informal censorship."
"Whether the add-on is unlawful, or whether any speech is unlawful, is for the courts to determine, not for DHS to determine," says Aden Fine, staff attorney with the ACLU's Speech, Privacy and Technology Project. "Nobody from DHS should be going around trying to get speech removed from the Internet before a court decides."

Exclusive: Third attack against Sony planned

A group of hackers says it is planning another wave of cyberattacks against Sony in retaliation for its handling of the PlayStation Network breach.
An observer of the Internet Relay Chat channel used by the hackers told CNET today that a third major attack is planned this weekend against Sony's Web site. The people involved plan to publicize all or some of the information they are able to copy from Sony's servers, which could include customer names, credit card numbers, and addresses, according to the source. The hackers claim they currently have access to some of Sony's servers.
Should the planned attack succeed, it would be the latest blow in a series of devastating security breaches of Sony's servers over the past month. The failure of Sony's server security has ignited investigations by the FBI, the Department of Justice, Congress, and the New York State Attorney General, as well as data security and privacy authorities in the U.K., Canada, and Taiwan.
Several weeks ago the hacker group known as Anonymous targeted several Sony Web sites, including Sony.com and SonyStyle.com, with a distributed denial-of-service (DDoS) attack in retaliation for what its members saw as Sony's unfair legal action against hacker George Hotz. Two weeks ago Sony's PlayStation Network, along with its Qriocity service and Sony Online, was the target of an attack that exposed the personal information of more than 100 million Sony customers. Sony was forced to shut down PSN, Qriocity, and Sony Online, and is currently working to bring them back online after rebuilding the security of its servers.
Sony says it doesn't know who orchestrated what it's calling a "highly sophisticated, planned" attack, but it has dropped hints that the group Anonymous is involved. Kazuo Hirai, chairman of Sony Computer Entertainment, told a Congressional subcommittee in a letter yesterday that the intruders on its servers planted a file named "Anonymous" containing the statement "We are Legion," part of the group's tagline.
Anonymous issued a statement yesterday denying it was involved in the PSN breach. "While we are a distributed and decentralized group, our 'leadership' does not condone credit card theft," the statement said.
Now it seems the same group of hackers that was able to infiltrate the PSN servers is planning to hit back against Sony.
Sony did not immediately respond to a request for comment.

NASA delays Endeavour launch until at least May 16

NASA said today that it has decided to push back the final launch of the space shuttle Endeavour until at least May 16. This is the third delay since the shuttle's April 29 launch was scrubbed due to problems with its hydraulic systems.

The shuttle Endeavour atop pad 39A shortly after a launch scrub was announced on April 29. NASA said today it would delay the launch until at least May 16.
(Credit: NASA TV)
In a release, NASA said that Endeavour will launch no earlier than May 16. After the April 29 scrubbing, the agency targeted May 2, then May 8, and now mid-May at the earliest. NASA managers have got to be worried that each subsequent delay is threatening the space shuttle program's last-ever launch, that of Atlantis, which is currently slated for June 28.

NASA said it will hold a press conference Monday to update the public on the status of repairs to Endeavour's hydraulic systems. "Kennedy [Space Center] technicians are continuing work to resolve an issue in a heater circuit associated with Endeavour's hydraulic system that resulted in the [April 29] launch postponement," NASA said in a release today. "Technicians determined the failure was inside an aft load control assembly, which is a switchbox in the shuttle's aft compartment, and possibly its associated electrical wiring."
The agency acknowledged that it has yet to uncover the underlying cause of the switchbox failure, but said its technicians are substituting hardware that might have been the problem. "This weekend, technicians will install and check out new wiring that bypasses the suspect electrical wiring connecting the switchbox to the heaters," the release stated. "They will also run the heaters for up to 30 minutes to verify they are working properly and complete retesting of the other systems powered by the switchbox."
While NASA is currently targeting May 16 as the earliest possible date for launching Endeavour on its final mission, the agency said that there are launch opportunities available until May 26. It did not address what would happen if Endeavour cannot be launched until after May 26, but presumably that would mean that the Atlantis mission would have to be pushed back.

Seinfeld launches a Web site for 10-year-olds

Not everyone finds Jerry Seinfeld funny. And those who do find him funny don't find all of him funny.
Seinfeld seems to understand this and has decided to use the Web in order to portion out his finer moments.
This week, he launched JerrySeinfeld.com, a place that says it's a personal archive. But it isn't a site that hosts every single skit, show, and stand-up set Seinfeld has performed.
Instead, it's a site for 10-year-olds.
Why, suddenly, would Seinfeld be interested in young kids? For this, you merely have to click on the "What is This?" section of the site.
"When I was 10 years old, I started watching stand-up comedians on TV. I fell in love with them and I'm just as fascinated with stand-up comedy today," he says.

Seinfeld in his bow tie period.
(Credit: CC Alan Light/Flickr)
So, with this site, he's releasing three little bits of himself every day. But he's only releasing the bits that still amuse him. In addition, he says that he will be posting new things he's doing.
But, at heart, it really is for the young, the excitable, those who still bathe in the wonderment of what might be, rather than the nostalgia of what has gone.
"Somewhere out there are 10-year-olds just waiting to get hooked on this strange pursuit," Seinfeld says on the site. "This is for them. I'm just hoping somehow it will keep this silliness going."
Silliness is something that needs to be curated. I am, frankly, astonished that there isn't a National Society for the Protection of Silliness. Especially in the US of A.
Indeed, Seinfeld--a strange, populist sort of Kafka--has resolutely defended the sillinesses of life against assaults from all sorts of hideous sectors and lobbies.
How reassuring that he has chosen to release consistent spurts of silliness onto a Web that is mired in the hacking and shellacking of others.
And no one understands silliness better than 10-year-olds.



Volkswagen to produce a range of plug-ins in 2013

The 260 mpg Volkswagen XL1 concept car.
(Credit: Volkswagen)
Volkswagen has jumped on the plug-in bandwagon and will produce a range of plug-in hybrid vehicles starting in 2013. Volkswagen CEO Martin Winterkorn made the announcement at the Vienna Motor Symposium this week, but he didn't specify which vehicles will get the plug-in powertrain.
"The plug-in hybrid offers precisely what many customers expect: unlimited internal combustion engine performance combined with attractive electric mobility ranges in everyday driving," Winterkorn said. The company acknowledges that electric vehicles will play a large role in the automotive future, but finds plug-ins to be a happy medium until infrastructure, technology, and consumers make pure EVs a viable option.
Earlier this year, the German auto manufacturer debuted the XL1 concept plug-in car at an auto show in Qatar. Based on the L1 concept, the gullwinged tandem two-seater has a carbon fiber chassis and is powered by a 48-horsepower two-cylinder TDI engine and an electric motor that produces 27 hp. The lightweight aerodynamic vehicle reportedly achieves up to 260 mpg.
VW will begin limited series production of 100 XL1s in 2013, said Winterkorn in an interview with Automotive News Europe. Germany will be the first country to receive the XL1, followed by the U.S. and China at a later date.

SF shelves cell phone radiation ordinance


San Francisco officials have indefinitely delayed implementation of the city's Right to Know ordinance, which would have required retailers to display a phone's Specific Absorption Rate (SAR) at the point of sale and distribute materials educating consumers on cell phone radiation. A revamped version of the legislation is likely to be introduced in its place, but no further details have been announced.
First passed last June, the ordinance (PDF) quickly prompted a lawsuit from the wireless industry's lobbying arm, the CTIA. In addition to claiming that the law was unconstitutional because only the FCC and FDA have oversight over radio frequency emissions, the CTIA contended that the SAR provision was misleading to consumers and that it infringed on the First Amendment rights of retailers.
As a result of the lawsuit, the San Francisco Board of Supervisors delayed the ordinance's implementation date several times--most recently to June 15--and held two closed-door meetings with City Attorney Dennis Herrera's office to discuss the issue. Board members wouldn't tell CNET what transpired during the meetings, nor would they comment on the CTIA's warning that the city would be stuck with its legal fees if the trade group won the lawsuit.
It's clear, however, that the city isn't backing down completely. Supervisor John Avalos, who voted for the measure last year, could introduce amended legislation as early as next week. Though Frances Hsieh, one of Avalos' legislative aides, wouldn't discuss specifics, it's expected that any amendment will remove the SAR provisions.
"We're working with the Mayor's Office, the City Attorney's Office, and advocacy groups to vet out the specifics," she said. "We want a solid set of amendments when we introduce them."
Ellen Marks, the director of government and public affairs for the Environmental Health Trust, supported the original legislation. Marks said she's fine if the SAR provision is removed from the ordinance, but she'd like to see something similar to a new California State Senate bill that would require a radio-frequency warning label on phones and product packaging.
The ordinance "is only a temporary hold," she told CNET. "I have positive feelings that the city will stand up to this frivolous lawsuit and move forward with amendments."


Google help wanted: Antitrust lawyer


It's the confluence of two phenomena: Google is on a hiring binge and the company is increasingly under regulators' antitrust microscope. So the search giant is looking to hire a new antitrust lawyer.
The company posted a help wanted ad on LinkedIn looking for a "Competition Counsel" at its Mountain View, Calif., headquarters. The job posting was first reported by Bloomberg.
The posting describes the role as one that both helps guide product development as well as participate in legal matters. "You'll be willing and able to work on a variety of competition matters including antitrust litigation and regulatory proceedings. You must be well suited to providing internal counseling on a wide variety of projects and business practices," according to the posting.
Regulators have ratcheted up their antitrust probes of Google as the company grows and reaches into new markets. Last month, the company signed a consent decree with the Department of Justice to secure approval for its $700 million deal to buy travel technology provider ITA Software. Citing antitrust concerns, a federal judge in March rejected a settlement the company struck with authors and publishers in an effort to digitize every book ever published. And in November, the European Commission opened an investigation into complaints that Google was skewing search results against rivals.
Google has said, as recently as its quarterly earnings call last month, that 2011 will be the biggest hiring year in the 24,400-employee company's history.


DARPA seeks help for interstellar starship

DARPA wants to go to the stars.
Yesterday, the Defense Advanced Research Projects Agency issued a call for concepts for a 100-year starship study program. The idea? To motivate research that could potentially "develop a viable and sustainable model for persistent, long-term, private-sector investment into the myriad of disciplines needed to make interstellar space travel practicable and feasible."

This, one can imagine, is the kind of feasibility study that would have been necessary decades ahead of time if the starships at the center of shows like "Star Trek," "Babylon 5," and "Deep Space 9" had really existed.
DARPA may be peopled with dreamers, but it also has a pretty impressive track record. Its predecessor, ARPA, played a central role in the creation of the Internet, and among many other accomplishments, DARPA researchers helped inspire autonomous cars via the agency's DARPA Grand Challenge, and they helped bring about stealth-fighter technology.
DARPA did not respond to a request for comment.
So while some are certainly going to scoff at the notion of a 100-year project (PDF file) to explore interstellar space, DARPA's ambitions should not be taken lightly. Particularly given some of the reasons behind the would-be project, and the steady decline in America's development of young engineers, mathematicians, and technologists.
"The genesis of the 100 Year Starship Study is to foster a rebirth of a sense of wonder among students, academia, industry, researchers, and the general population to consider 'why not,'" DARPA wrote in its request for information, "and to encourage them to tackle whole new classes of research and development related to all the issues surrounding long-duration, long-distance spaceflight. DARPA contends that the useful, unanticipated consequences of such research will have benefit to the Department of Defense and to NASA, as well as the private and commercial sector."
But because today's financial realities preclude the massive amount of investment that would be required to undertake a very long-term project like the development of a starship, DARPA is understandably turning to outside interests to begin the work. The agency said it is looking for "ideas for an organization, business model, and approach appropriate for a self-sustaining investment vehicle. The respondent must focus on flexible yet robust mechanisms by which an endowment can be created and sustained, wholly devoid of government subsidy or control, and by which worthwhile undertakings--in the sciences, engineering, humanities, or the arts--may be awarded in pursuit of the vision of interstellar flight."
This calls to mind, of course, large-scale competitions like those put on by the X Prize Foundation. On the other hand, the phrase "wholly devoid of government subsidy" would seem to prohibit the offering of a substantial prize to someone deemed successful at answering DARPA's requirements. The agency did say that it expects to offer no more than several hundred thousand dollars in start-up expenses to whoever meets its requirements.
In particular, those requirements include: "Long-term survivability over a century-long time horizon;" "Self-governance, independent of government participation or oversight;" "Self-sustainment, independent of government funding;" and "Relevance to the goal of moving humanity toward the goal of interstellar travel, including related technological, biological, social, economic, and other issues."
These are grand goals, and it's hard to imagine anyone reading these words being alive to see the conclusion of a project like this. Yet without such ambitions, our society would almost certainly lose the benefits that could come from the realization of such goals, benefits that come from the spread of government-sponsored technology to educational institutions and private industry--and from the wonder such projects inspire in people young and old. This may be wishful thinking on DARPA's part, but how can we not wish to go along for the ride?

Thursday, April 28, 2011

Digging Deeper, Seeing Farther: Supercomputers Alter Science





SAN FRANCISCO — Inside a darkened theater a viewer floats in a redwood forest displayed with Imax-like clarity on a cavernous overhead screen.
(Photos: Jim Wilson/The New York Times. A video trailer accompanies "Life: A Cosmic Story," the new planetarium show at the California Academy of Sciences; a second photo shows the visualization studio in the basement beneath the planetarium.)
The hovering sensation gives way to vertigo as the camera dives deeper into the forest, approaches a branch of a giant redwood tree, and then plunges first into a single leaf and then into an individual cell. Inside the cell the scene is evocative of the 1966 science fiction movie “Fantastic Voyage,” in which Lilliputian humans in a minuscule capsule take a medical journey through a human body.
There is an important difference — “Life: A Cosmic Journey,” a multimedia presentation now showing at the new Morrison Planetarium here at the California Academy of Sciences, relies not just on computer animation techniques, but on a wealth of digitized scientific data as well.
The planetarium show is a visually spectacular demonstration of the way computer power is transforming the sciences, giving scientists tools as important to current research as the microscope and telescope were to earlier scientists. Their use accompanies a fundamental change in the material that scientists study. Individual specimens, whether fossils, living organisms or cells, were once the substrate of discovery. Now, to an ever greater extent, researchers work with immense collections of digital data, and the mastery of such mountains of information depends on computing power.
The physical technology of scientific research is still here — the new electron microscopes, the telescopes, the particle colliders — but they are now inseparable from computing power, and it is the computers that let scientists find order and patterns in the raw information that the physical tools gather.
Computer power not only aids research, it defines the nature of that research: what can be studied, what new questions can be asked, and answered.
“The profound thing is that today all scientific instruments have computing intelligence inside, and that’s a huge change,” said Larry Smarr, an astrophysicist who is director of the California Institute for Telecommunications and Information Technology, or Calit2, a research consortium at the University of California, San Diego.

In the planetarium’s first production, “Fragile Planet,” the viewer was transported through the roof of the Morrison, first appearing to fly in a graceful arc around the Renzo Piano-designed museum and then quickly out into the solar system to explore the cosmos. Where visual imagery was once projected on the dome of the original Morrison Planetarium using an elaborate home-brew star projector, the new system is powered by three separate parallel computing systems which store so much data that the system is both telescope and microscope. From incomprehensibly small to unimaginably large, the computerized planetarium moves seamlessly over 12 orders of magnitude in the objects it presents. It can shift “from subatomic to the large-scale structure of the universe,” said Ryan Wyatt, an astronomer who is director of the planetarium.
It is, said Katy Börner, an Indiana University computer scientist who is a specialist in scientific visualization, a “macroscope.” She uses the word to describe a new class of computer-based scientific instruments to which the new planetarium’s virtual and physical machine belongs. These are composite tools: physical instruments of various kinds paired with software so powerful and flexible that together they become a complete scientific workbench, one that can be reconfigured by mixing and matching aspects of the software to tackle specific research problems.
The planetarium’s macroscope is designed for education, but it could be used for research. Like any macroscope, its essence is its capacity for approaching huge databases in a variety of ways. “Macroscopes provide a ‘vision of the whole,’” Dr. Börner wrote in the March issue of Communications of the Association for Computing Machinery, “helping us ‘synthesize’ the related elements and detect patterns, trends and outliers while granting access to myriad details.” She said software-based scientific instruments are making it possible to uncover phenomena and processes that in the past have been “too great, slow or complex for the human eye and mind to notice and comprehend.”
Computing is reshaping scientific research in a number of ways, Dr. Börner notes. For example, independent scientists have increasingly given way to research teams, as evidenced by scientific papers in the field of high-energy physics that routinely have hundreds or even thousands of authors. It is unsurprising, in a way, since the Web was invented as a collaboration tool for the high-energy physics community at CERN, the European nuclear research laboratory, in the early 1990s. As a result, research teams in all scientific disciplines are increasingly both interdisciplinary and widely distributed geographically.
So-called Web 2.0 software, with its seamless linking of applications, has made it easier to share research findings, and that in turn has led to an explosion of collaborative efforts. It has also broadened the range of cross-disciplinary projects, as it has become easier to repurpose and combine software-based techniques ranging from analytical tools to utilities for exporting and importing data.
A macroscope need not be in a single physical location. To take one example, a midday visitor to the lab of Tom DeFanti, a computer graphics specialist, in the Calit2 building in San Diego is greeted by a wall-size array of screens that appears to offer a high-resolution window into a vacant laboratory somewhere else in the world. The distant room is a parallel laboratory at King Abdullah University of Science and Technology, in Thuwal, Saudi Arabia. Four years ago representatives of that university visited Calit2 and initiated a collaboration in which the American scientists helped create a parallel scientific visualization center in Thuwal connected to the Internet by up to 10 gigabits of bandwidth — enough to share high-resolution imagery and research.
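A quick back-of-the-envelope check suggests why 10 gigabits is "enough to share high-resolution imagery": the frame size and pixel depth below are assumed illustrative figures, not numbers from the article.

```python
# Rough feasibility check (assumed figures): how long would one
# uncompressed gigapixel frame take to cross a 10 Gbit/s research link?
pixels = 1_000_000_000             # assumed one-gigapixel frame
bytes_per_pixel = 3                # assumed 24-bit RGB, no compression
link_bps = 10 * 10**9              # 10 gigabits per second
seconds = pixels * bytes_per_pixel * 8 / link_bps
print(f"{seconds:.1f} s per frame")  # about 2.4 s
```

Even uncompressed, a gigapixel frame moves in seconds rather than minutes, which is what makes interactive sharing between the two visualization centers plausible.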
Saudi researchers now have access to a software system known as Scalable Adaptive Graphics Environment, or SAGE, originally developed to permit scientists working far apart to share and visualize research data. SAGE is essentially an operating system for visual information, capable of displaying and manipulating images up to about one-third of a billion pixels — as much as 150 times more than what can be displayed on a conventional computer display.
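The "150 times" figure can be sanity-checked with simple arithmetic; the 1080p baseline below is an assumption, since the article does not say which conventional display it compares against.

```python
# Back-of-the-envelope check of SAGE's display scale (assumed baseline).
sage_pixels = 333_000_000          # "about one-third of a billion pixels"
conventional = 1920 * 1080         # assuming a 1080p desktop display
ratio = sage_pixels / conventional
print(f"SAGE wall: ~{ratio:.0f}x a single 1080p display")
```

With that baseline the factor comes out near 160, consistent with the article's "as much as 150 times."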
SEAMLESS From a basement bank of servers, Cheryl Vanderbilt guides the planetarium display through over 12 orders of magnitude in the objects it presents. (Jim Wilson/The New York Times)

MACROSCOPE A long line last week at the planetarium waited to see a show powered by a whole new class of computer-based scientific instruments. (Jim Wilson/The New York Times)

“The killer application is collaboration; that is what people want,” Dr. DeFanti said. “You can save so much energy by not flying to London that it will run a rack of computers for a year.”
More than a decade ago Dr. Smarr began building a distributed supercomputing capability he called the OptIPuter, because it used the fiber-optic links among the nation’s supercomputer centers to make it possible to divide computing problems as well as digital data so that larger scientific computing loads could be shared.
The advent of high-performance computing systems, however, created a new bottleneck for scientists, he said. “Over the past decade computers have become over a thousand times faster because of Moore’s Law and the ability to store information has gone up roughly 10,000 times, while the number of pixels we can display is maybe only a factor of two different,” he said.
To make it possible for visualization to catch up with accelerating computing capacity, researchers at Calit2 and others have begun designing display systems called OptIPortals that offer better ways of representing scientific data.
Recently, the Calit2 researchers have begun building scaled-down versions called OptIPortables, which are smaller display systems that can be fashioned like Lego blocks from just a handful of displays, rather than dozens or hundreds. The OptIPortable displays can be quickly set up and moved, and Dr. DeFanti said his lab was now at capacity assembling systems for research groups around the world.
Within many scientific fields software-based instruments are quickly adding new functions as open-source systems make it possible for small groups or even individuals to add features that permit customization.
Cytoscape is a bioinformatics software tool set that evolved, beginning in 2001, from research in the laboratory of Leroy Hood at the University of Washington. Dr. Hood, one of the founders of the Institute for Systems Biology in Seattle, was a pioneer in the field of automated gene sequencing, and one of his graduate students at the time, Trey Ideker, was exploring whether it was possible to automate the mapping of gene interactions.
As complex a task as gene sequencing is, charting the multiplicity of interactions that are possible among the roughly 30,000 genes that make up the human genome is even more complex. It has led to the emergence of the field of network biology as biologists begin to build computer-aided models of cellular and disease processes.
“Very quickly we realized we weren’t the only ones facing this problem and that others were independently developing software tools,” Dr. Ideker said. The researchers decided to take what at the time was a large risk, and began to develop their code as an open-source software development project, meaning that it could be freely shared by the entire biological community. The project picked up speed when Dr. Ideker, who is now chief of genetics at the U.C.S.D. School of Medicine, merged his efforts with Gary Bader, a biologist who now runs a computational biology laboratory at the University of Toronto.
The project gained collaborators over the past decade as other researchers decided to contribute to it rather than develop independent tools, and it accelerated further because the software was designed so that independent researchers could contribute new modules tailored to specific tasks.
“We allowed what we called plug-ins back in 2001 — nowadays with Apple’s success you would call them an app,” he said. “There are a couple of hundred apps available for Cytoscape.” The project is now maintained with a $6.5 million grant from the National Institute of General Medical Sciences at the National Institutes of Health.
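The plug-in idea Dr. Ideker describes is a common architectural pattern: a host application exposes a registry, and independent contributors drop in functions without touching the core. Cytoscape itself is written in Java; the sketch below is an illustrative Python miniature of the pattern, with all names invented for the example.

```python
# Minimal sketch of a plug-in registry, the pattern behind extensible
# tools like Cytoscape's apps (illustrative only; not Cytoscape's API).
plugins = {}

def register(name):
    """Decorator that publishes a function in the shared registry."""
    def wrap(fn):
        plugins[name] = fn
        return fn
    return wrap

@register("degree")
def node_degree(adjacency):
    # Count interaction partners for each node in a dict-of-sets graph.
    return {node: len(nbrs) for node, nbrs in adjacency.items()}

# A tiny three-gene interaction network as a usage example.
graph = {"A": {"B", "C"}, "B": {"A"}, "C": {"A"}}
print(plugins["degree"](graph))  # {'A': 2, 'B': 1, 'C': 1}
```

The host never needs to know which analyses exist in advance; anything registered under a name becomes available to every user, which is what let hundreds of independent "apps" accumulate around Cytoscape.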
Tools like Cytoscape have a symbiotic relationship with the immense databases that have grown to support the activities of scientists studying newer fields like genomics and proteomics. Gene sequencing led to the creation of GenBank, which is now maintained by the National Center for Biotechnology Information. And with a growing array of digital data streams, other databases are being curated — in Europe, for example, at the European Bioinformatics Institute, which has begun to build an array of new databases for functions like protein interactions. Cytoscape helps transform the disparate databases into a federated whole with the aid of plug-ins that allow a scientist to pick and choose from different sources.
For Dr. Börner, the Indiana University computer scientist, the Cytoscape model is a powerful one that builds on the sharing mechanism that is the foundation of the Internet.
The idea, she said, is inspired by witnessing the power and impact of the sharing inherent in Web services like Flickr and YouTube. Moreover, it has the potential of being rapidly replicated across many scientific disciplines.
“You can now also share plug-in algorithms,” she said. “You can now create your own library by plugging in your favorite algorithms into your tool.”

Sunday, April 24, 2011

Time machine and dry ice on Mars


Dry ice, wetter Mars (Science News)
Newfound cache of frozen carbon dioxide could periodically thicken Red Planet’s atmosphere
Web edition: Thursday, April 21st, 2011
CAPPED The thickness of a newfound reservoir of frozen carbon dioxide at Mars’ south polar cap varies from a few meters (blue) to more than 500 meters (red). (JPL-Caltech, U. of Rome, Southwest Research Institute)
A newfound reservoir of dry ice on Mars suggests that the planet’s surface has been wetter in the relatively recent past, though not necessarily warmer than it is today.
The new study adds to evidence that Mars once had a carbon dioxide atmosphere thick enough to keep liquid water on the surface from evaporating. It’s unclear whether the planet would have been hospitable for life, however, because temperatures on Mars may actually have been slightly colder during times when the atmosphere had a greater amount of carbon dioxide.
Roger Phillips of the Southwest Research Institute in Boulder, Colo., and his colleagues base their findings on radar studies by the Mars Reconnaissance Orbiter of the layered deposits at Mars’ south polar cap. Earlier studies had indicated that a veneer of frozen carbon dioxide sits atop part of the cap with a thin layer of water ice beneath it. But a detailed analysis of radar reflected from different layers of the cap reveals that beneath the frozen water lies a volume of carbon dioxide ice 30 times greater than previously estimated, the team reports online April 21 in Science.
This unexpected reservoir of dry ice is intriguing, Phillips says, because about every 100,000 years Mars is known to dramatically tilt its spin axis. During these periods of high polar tilt, enough sunlight falls on the poles to vaporize the frozen carbon dioxide and release it into the atmosphere, roughly doubling the atmospheric pressure on the Red Planet. With a denser atmosphere, liquid water could persist on the surface rather than evaporating, and might account for some of the features on Mars that appear to have been carved by water, such as channels and gullies, Phillips notes.
Although the newly found reservoir could nearly double the mass of carbon dioxide in Mars’s atmosphere, the resulting climate alterations would be “modest” and would not generate a warmer, wetter Mars, notes Peter Thomas of Cornell University in a commentary also posted online April 21 in Science.
Phillips concurs and notes that during times of higher tilt, more carbon dioxide frost would settle on the planet’s surface. The reflectivity of the surface frost, along with other effects, would offset any greenhouse warming from the extra gas in the atmosphere, and would tend to maintain the chilly temperatures now typical on the Red Planet.
Warmer conditions would require a much thicker carbon dioxide atmosphere supplied by an additional source of the compound, such as carbonates in Martian rocks, says Thomas. The abundance of carbonates in the rocks is still under exploration.

'Time Machine' Made to Visually Explore Space and Time in Videos: Time-Lapse GigaPans Provide New Way to Access Big Data

ScienceDaily (Apr. 22, 2011) — Researchers at Carnegie Mellon University's Robotics Institute have leveraged the latest browser technology to create GigaPan Time Machine, a system that enables viewers to explore gigapixel-scale, high-resolution videos and image sequences by panning or zooming in and out of the images while simultaneously moving back and forth through time.
Viewers, for instance, can use the system to focus in on the details of a booth within a panorama of a carnival midway, but also reverse time to see how the booth was constructed. Or they can watch a group of plants sprout, grow and flower, shifting perspective to watch some plants move wildly as they grow while others get eaten by caterpillars. Or, they can view a computer simulation of the early universe, watching as gravity works across 600 million light-years to condense matter into filaments and finally into stars that can be seen by zooming in for a close up.
"With GigaPan Time Machine, you can simultaneously explore space and time at extremely high resolutions," said Illah Nourbakhsh, associate professor of robotics and head of the CREATE Lab. "Science has always been about narrowing your point of view -- selecting a particular experiment or observation that you think might provide insight. But this system enables what we call exhaustive science, capturing huge amounts of data that can then be explored in amazing ways."
The system is an extension of the GigaPan technology developed by the CREATE Lab and NASA, which can capture a mosaic of hundreds or thousands of digital pictures and stitch those frames into a panorama that can be interactively explored via computer. To extend GigaPan into the time dimension, image mosaics are repeatedly captured at set intervals, and then stitched across both space and time to create a video in which each frame can be hundreds of millions, or even billions, of pixels.
An enabling technology for time-lapse GigaPans is a feature of the HTML5 language that has been incorporated into such browsers as Google's Chrome and Apple's Safari. HTML5, the latest revision of the HyperText Markup Language (HTML) standard that is at the core of the Internet, makes browsers capable of presenting video content without use of plug-ins such as Adobe Flash or Quicktime.
Using HTML5, CREATE Lab computer scientists Randy Sargent, Chris Bartley and Paul Dille developed algorithms and software architecture that make it possible to shift seamlessly from one video portion to another as viewers zoom in and out of Time Machine imagery. To keep bandwidth manageable, the GigaPan site streams only those video fragments that pertain to the segment and/or time frame being viewed.
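The bandwidth-saving idea is that only tiles intersecting the current viewport need to be fetched. The sketch below is an illustrative reconstruction of that selection step, not the GigaPan site's actual code; the 256-pixel tile size and all function names are assumptions.

```python
# Illustrative sketch (not GigaPan's actual code): choose which video
# tiles to stream for the current viewport at one zoom level.
TILE = 256  # assumed tile edge in pixels

def visible_tiles(x, y, width, height):
    """Return (col, row) indices of tiles overlapping the viewport."""
    first_col, first_row = x // TILE, y // TILE
    last_col = (x + width - 1) // TILE
    last_row = (y + height - 1) // TILE
    return [(c, r) for r in range(first_row, last_row + 1)
                   for c in range(first_col, last_col + 1)]

# A 512x200 viewport positioned at (300, 100) touches just six tiles.
print(visible_tiles(300, 100, 512, 200))
```

However many tiles a frame contains, the client only ever requests the handful under the viewer's window, which is what keeps bandwidth manageable as users pan and zoom.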
"We were crashing the browsers early on," Sargent recalled. "We're really pushing the browser technology to the limits."
Guidelines on how individuals can capture time-lapse images using GigaPan cameras are included on the site created for hosting the new imagery's large data files, http://timemachine.gigapan.org. Sargent explained the CREATE Lab is eager to work with people who want to capture Time Machine imagery with GigaPan, or use the visualization technology for other applications.
Once a Time Machine GigaPan has been created, viewers can annotate and save their explorations of it in the form of video "Time Warps."
Though the time-lapse mode is an extension of the original GigaPan concept, scientists already are applying the visualization techniques to other types of Big Data. Carnegie Mellon's Bruce and Astrid McWilliams Center for Cosmology, for instance, has used it to visualize a simulation of the early universe performed at the Pittsburgh Supercomputing Center by Tiziana Di Matteo, associate professor of physics.
"Simulations are a huge bunch of numbers, ugly numbers," Di Matteo said. "Visualizing even a portion of a simulation requires a huge amount of computing itself." Visualization of these large data sets is crucial to the science, however. "Discoveries often come from just looking at it," she explained.
Rupert Croft, associate professor of physics, said cosmological simulations are so massive that only a segment can be visualized at a time using usual techniques. Yet whatever is happening within that segment is being affected by forces elsewhere in the simulation that cannot be readily accessed. By converting the entire simulation into a time-lapse GigaPan, however, Croft and his Ph.D. student, Yu Feng, were able to create an image that provided both the big picture of what was happening in the early universe and the ability to look in detail at any region of interest.
Using a conventional GigaPan camera, Janet Steven, an assistant professor of biology at Sweet Briar College in Virginia, has created time-lapse imagery of rapid-growing brassicas, known as Wisconsin Fast Plants. "This is such an incredible tool for plant biology," she said. "It gives you the advantage of observing individual plants, groups of plants and parts of plants, all at once."
Steven, who has received GigaPan training through the Fine Outreach for Science program, said time-lapse photography has long been used in biology, but the GigaPan technology makes it possible to observe a number of plants in detail without having separate cameras for each plant. Even as one plant is studied in detail, it's possible to also see what neighboring plants are doing and how that might affect the subject plant, she added.
Steven said creating time-lapse GigaPans of entire landscapes could be a powerful tool for studying seasonal change in plants and ecosystems, an area of increasing interest for understanding climate change. Time-lapse GigaPan imagery of biological experiments also could be an educational tool, allowing students to make independent observations and develop their own hypotheses.
Google Inc. supported development of GigaPan Time Machine.