What the Future is Bringing Us (2013)

'''What the Future is Bringing Us entries are grouped by year. Click on the desired year.'''


 * (2018)


 * (2017)


 * (2016)


 * (2015)


 * (2014)


 * (2013) This is the page you are currently viewing.


 * (2012)


 * (2011)


 * (2010)


 * (2009)


 * (2008)


 * (2007)


 * (2004) Information and Communication Technology (ICT) planning document developed by David Moursund. The goal was to facilitate the development of a sequence of 1-credit (quarter hour system) graduate-level joint preservice and inservice courses to be taught at the University of Oregon.


 * (2000 to 2003) Golden Oldie News Oct-December 2000 up through Jan-March 2003. These materials were moved from an old Oregon Technology Education Council (OTEC) site developed by David Moursund. Most of the links in the referenced articles no longer work.


 * (1987 Futuristic Math Education Scenarios).


 * (1974 to 2001) All of David Moursund's editorials published in Learning and Leading with Technology from its inception in 1974 until he retired from ISTE in 2001.

BigDog is a rough-terrain robot built by Boston Dynamics that walks, runs, climbs, and carries heavy loads. It is powered by an engine that drives a hydraulic actuation system, and its four legs are articulated like an animal’s, with compliant elements to absorb shock and recycle energy from one step to the next. BigDog is the size of a large dog or small mule: about 3 feet long, 2.5 feet tall, and 240 lbs. See more pictures of social robots at http://www.bing.com/images/search?q=pictures+social+robots&qpvt=pictures+social+robots&qpvt=pictures+social+robots&FORM=IQFRML.



See MIT's Electric Cheetah Robot video at http://www.engadget.com/2014/09/15/mit-darpa-cheetah-robot/.



The Cray-2 supercomputer was the world's fastest supercomputer until 1990. But even with a peak performance of 1.9 GFLOPS, the liquid-cooled, 200-kilowatt machine ranks behind a number of "modern" portable, battery-powered smartphones when it comes to GFLOPS ratings.
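A rough back-of-the-envelope comparison makes the point concrete. The Cray-2 figures come from the paragraph above; the smartphone figures are illustrative assumptions for a 2013-era device, not measured benchmarks:

```python
# Back-of-the-envelope comparison of the Cray-2 against a modern smartphone.
# The Cray-2 numbers are from the text; the phone numbers are assumed
# ballpark values, not measurements of any particular device.
cray2_gflops = 1.9        # peak performance of the Cray-2
cray2_watts = 200_000     # liquid-cooled, 200-kilowatt machine
phone_gflops = 10.0       # assumed peak for a 2013-era smartphone SoC
phone_watts = 5.0         # assumed power draw under load

speedup = phone_gflops / cray2_gflops
efficiency_gain = (phone_gflops / phone_watts) / (cray2_gflops / cray2_watts)
print(f"Phone is roughly {speedup:.1f}x faster")
print(f"and roughly {efficiency_gain:,.0f}x more energy-efficient per watt")
```

Even under these loose assumptions, the per-watt gap of five orders of magnitude is the striking figure, not the raw speed.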

Year 2013 Table of Contents

 * "All education springs from some image of the future. If the image of the future held by a society is grossly inaccurate, its education system will betray its youth." (Alvin Toffler; American writer and futurist; born October 3, 1928.)


 * "Don't worry about what anybody else is going to do. The best way to predict the future is to invent it. Really smart people with reasonable funding can do just about anything that doesn't violate too many of Newton's Laws!" (Alan Kay; American computer scientist and educator; born May 17, 1940.)

Introduction
All of education is future oriented. Through informal and formal education, students are being prepared for their futures. Of course, a major goal of education is to preserve and pass on the culture, values, history, and so on from the past. Ideally, this is done in a manner that helps prepare students for their futures as members of local, regional, national, and world societies.

Special Message for Teachers. Consider establishing a "futures" time period each week, in which you engage your students in an exploration of possible futures they will live in and how the subject(s) you are teaching are helping to prepare them for these possible futures. One way to do this is to select a topic from this year's list, or other annual lists published on this website. Engage students in a discussion of what they know about the topic. Perhaps point them to some material to read. Engage them in a discussion of how the content you are teaching fits in with preparing them for life in a world in which the forecasts on this website may well come true.

Another approach is to encourage your students to bring in hardcopy materials and Web links that contain forecasts of the future. Each week a different small team of students could assume responsibility for leading the weekly "futures" session.

Still another approach is to raise the following question with your students near the beginning of any new unit of study: "What changes are going on around the world that are having a major impact on this unit of study?" The idea is to emphasize change and the understanding that you are helping your students to get an education that prepares them for a changing world.

Teachers working with students may also be interested in having the students research and report on one or more "futures predictions" from 5 to 10 years ago, or perhaps when they were in first grade, or the year they were born, and so on. They can find out which predictions have become part of our world today and which ones failed to materialize, and why or why not in each case.

The article Two Brains Are Better Than One discusses the capabilities of a human brain versus a computer brain. A student's human brain gets better through maturation, informal education, and formal education. A computer's brain gets better through the development of better hardware and software, based on the work of a large number of researchers and programmers. A student's overall cognitive capabilities get better through education using an appropriate combination of human and computer brains.

Predictions for 2030
Frey, T. (12/23/2013). 33 dramatic predictions for 2030. The Futurist. Retrieved 4/10/2015 from http://www.wfs.org/blogs/thomas-frey/33-dramatic-predictions-for-2030. Quoting from the article:


 * Humanity will change more in the next 20 years than in all of human history.


 * By 2030 the average person in the U.S. will have 4.5 packages a week delivered with flying drones. They will travel 40% of the time in a driverless car, use a 3D printer to print hyper-individualized meals, and will spend most of their leisure time on an activity that hasn’t been invented yet.


 * The world will have seen over 2 billion jobs disappear, with most coming back in different forms in different industries, with over 50% structured as freelance projects rather than full-time jobs.


 * Over 50% of today’s Fortune 500 companies will have disappeared, over 50% of traditional colleges will have collapsed, and India will have overtaken China as the most populous country in the world.

IBM's Five Year Forecast
IBM recently published its annual five-year forecast for technological changes. See:


 * Relaxnews (12/17/2013). IBM sees five tech-powered changes in next five years. Yahoo News. Retrieved 12/20/2013 from http://news.yahoo.com/ibm-sees-five-tech-powered-changes-next-five-090614295.html.

Quoting from the article:


 * IBM said that its annual forecast of five ways technology will change lives in the coming five years was "driven by a new era of cognitive systems where machines will learn, reason, and engage with us in a more natural and personalized way."

Quoting IBM’s five-year forecasts about education:


 * Predictions for the coming five years included "classrooms of the future" equipped with systems that track and analyze each student's progress to tailor curriculum and help teachers target learning techniques.


 * "Basically, the classroom learns you," IBM vice president of innovation Bernie Meyerson told AFP. "It is surprisingly straight-forward to do."

For David Moursund's comments about this forecast, see http://i-a-e.org/iae-blog/entry/education-for-the-future.html.

Exascale Computer in Ten Years

 * Brewin, Bob (12/16/2013). Congress tells Energy Dept. to develop 'exascale' computers in 10 years. Nextgov. Retrieved 12/20/2013 from http://www.nextgov.com/defense/2013/12/energy-dept-told-develop-exascale-computers-10-years/75568/.

Quoting from the article:


 * Congress is directing the Energy Department to take the next decade to develop a new class of supercomputers capable of a quintillion operations per second to model nuclear weapons explosions, according to language in the 2014 National Defense Authorization Act passed by the House last week, with a Senate vote expected this week.


 * Departmental officials believe they could develop exascale supercomputers within 10 years, according to estimates offered at an Advanced Scientific Computing Advisory Committee meeting in Denver last month.


 * The exascale supercomputers will operate at a speed 1,000 times faster than the current record holder, a machine developed by China’s National University of Defense Technology that performs just under 34 quadrillion calculations per second, William Harrod, an ASCR division director told the conference.

Comment from David Moursund: The author of the article indicates that a quintillion calculations per second is a thousand times as fast as the current Chinese computer, which is rated at 34 quadrillion calculations per second. Actually, it is only about 30 times as fast.

I find it disturbing that the emphasized use is modeling nuclear weapons explosions. A "saving grace" of the article is that it goes on to explain:


 * Besides weapons research and simulation, Harrod said exascale computers would help support processing of complex “big data” sets, including climate modeling and genomics, with the first system slated to go into operation in 2023.
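The speed ratio discussed in Moursund's comment above can be verified with a line of arithmetic:

```python
# Checking the article's claim that an exascale machine would be 1,000x
# faster than the current record holder. Figures are from the article.
exascale_ops = 1e18       # one quintillion (exa) operations per second
tianhe_ops = 34e15        # "just under 34 quadrillion" calculations per second

ratio = exascale_ops / tianhe_ops
print(f"Exascale is about {ratio:.0f}x faster, not 1,000x")
```

The result, about 29, supports the "about 30 times as fast" correction.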

SLAC National Accelerator Laboratory
Quoting from Wikipedia: "SLAC National Accelerator Laboratory, originally named Stanford Linear Accelerator Center, is a United States Department of Energy National Laboratory operated by Stanford University under the programmatic direction of the U.S. Department of Energy Office of Science." Here is an article about some breakthrough research.


 * SLAC National Accelerator Laboratory Press Release (11/21/2013). Will 2-D tin be the next supermaterial? Retrieved 12/15/2013 from https://www6.slac.stanford.edu/news/2013-11-21-tin-super-material-stanene.aspx.

Quoting from the article:


 * A single layer of tin atoms could be the world’s first material to conduct electricity with 100 percent efficiency at the temperatures that computer chips operate, according to a team of theoretical physicists led by researchers from the U.S. Department of Energy’s (DOE) SLAC National Accelerator Laboratory and Stanford University.


 * Researchers call the new material "stanene," combining the Latin name for tin (stannum) with the suffix used in graphene, another single-layer material whose novel electrical properties hold promise for a wide range of applications.


 * "Stanene could increase the speed and lower the power needs of future generations of computer chips, if our prediction is confirmed by experiments that are underway in several laboratories around the world," said the team leader, Shoucheng Zhang, a physics professor at Stanford and the Stanford Institute for Materials and Energy Sciences (SIMES), a joint institute with SLAC.

What Comes After Silicon?
Thibodeau, Patrick (11/19/2013). Supercomputing's big problem: What's after silicon? Computerworld. Retrieved 11/20/2013 from http://www.computerworld.com/s/article/9244179/Supercomputing_s_big_problem_What_s_after_silicon_. Quoting from the article:


 * Supercomputing users are relentless in their pursuit of compute power so they can run simulations of increasing complexity and scale to tackle mankind's truly big problems. But Moore's Law, once a reliable predictor of computing power's future, has reached its limits.


 * Supercomputing researchers aren't sure what's next.


 * But change will arrive, and it will likely be disruptive.


 * In switching from standard semiconductor technologies, there's "no reason to believe that the same companies [leading the industry] now will necessarily have the big advantage," said Peter Beckman, a top computer scientist at the Department of Energy's Argonne National Laboratory, and head of an international exascale software effort.

The article discusses various technologies that have a good chance of extending the rapid growth of computer technology that we have become used to over the past decades.

Super Capacitors
Cooney, Michael (10/23/2013). Silicon-based supercapacitor could change way electricity is stored. NetworkWorld. Retrieved 10/27/2013 from http://www.networkworld.com/community/node/84090. Quoting from the article:


 * Vanderbilt University researchers say they have come up with a way to store electricity on a silicon-based supercapacitor that would let mobile phones recharge in seconds and let them continue to operate for weeks without recharging.


 * Cary Pint, the assistant professor of mechanical engineering who headed the development, said the group is currently using this approach to develop energy storage that can be formed in the excess materials or on the unused back sides of solar cells and sensors. The supercapacitors would store the excess electricity that the cells generate at midday and release it when the demand peaks in the afternoon.

I spent some time browsing the technical paper (see http://www.nature.com/srep/2013/131022/srep03020/full/srep03020.html) that reports on this research. It is clearly over my head, but I find it interesting that there is substantial underlying theory and it is being translated into products. If these products can be mass produced at a "reasonable" cost, the net result will be a very major change in the battery industry. The portable electronic device example of a cell phone, and the example of storing energy from solar panels, represent huge breakthroughs that will help contribute to quality of life throughout the world.

Smaller and Faster Computers
Morgan, James (10/18/2013). IBM unveils computer fed by 'electronic blood.' BBC News. Retrieved 10/24/2013 from http://www.bbc.co.uk/news/science-environment-24571219. Quoting from the article:


 * "The human brain is 10,000 times more dense and efficient than any computer today.


 * [IBM's] new "redox flow" system pumps an electrolyte "blood" through a computer, carrying power in and taking heat out.


 * Their vision is that by 2060, a one petaflop computer that would fill half a football field today, will fit on your desktop.


 * "We want to fit a supercomputer inside a sugar cube. To do that, we need a paradigm shift in electronics—we need to be motivated by our brain," says Michel.

Internet in Outer Space
Gaudin, Sharon (10/22/2013). NASA says first space Internet test 'beyond expectations.' Computerworld. Retrieved 10/24/2013 from http://www.computerworld.com/s/article/9243426/NASA_says_first_space_Internet_test_beyond_expectations_. Quoting from the website:


 * NASA's lunar spacecraft, the Lunar Atmosphere and Dust Environment Explorer or LADEE, has begun a month-long test of a high data-rate laser communication system.


 * If the system works as planned, similar laser systems are expected to replace radio systems to speed up future satellite communications as well as deep space communications with robots and human exploration crews.

Computers Based on Nanotube Technology
Gaudin, Sharon (9/30/2013). Replacing silicon with nanotubes could revolutionize tech. Computerworld. Retrieved 10/6/2013 from http://www.computerworld.com/s/article/9242812/Replacing_silicon_with_nanotubes_could_revolutionize_tech. Quoting from the article:


 * Scientists at Stanford University last week announced they had built the first functioning computer that used only carbon nanotube transistors.


 * "Hypothetically, if researchers can remove the current issues by using carbon nanotubes, and manufacturers adopt it, they could build smaller everything with less power required," he added.


 * Moorhead said if carbon nanotubes become a viable alternative to silicon, major manufacturers like Intel, Samsung and TSMC should quickly look to move to the new technology, despite the high cost of reworking their manufacturing processes. "The new technology would impact every device that uses chips, from cars to watches to phones to computers," he added.

Printable Computers
Marks, Paul (10/3/2013). Print a working computer on an $80 inkjet. NewScientist. Retrieved 10/4/2013 from http://www.newscientist.com/article/mg22029374.500-print-a-working-paper-computer-on-an-80-inkjet.html#.Uk8SLSR1Fe8. Quoting from the article:


 * Hodges, along with Yoshihiro Kawahara and his team at the University of Tokyo, Japan, have found a way to print the fine, silvery lines of electronic circuit boards onto paper. What's more, they can do it using ordinary inkjet printers, loaded with ink containing silver nanoparticles. Last month Kawahara demonstrated a paper-based moisture sensor at the Ubicomp conference in Zurich, Switzerland.


 * If silver-based inkjet printing can be made affordable, Hodges says it will be a natural follow-on to Bare Conductive's hand-drawn and paintable circuitry. Kawahara goes further: "In 20 years you really will be able to hit 'Print' and make yourself a mobile phone".

Carbon Nanotube Technology
Anderson, Thomas (9/26/2013). Stanford engineers build computer using carbon nanotube technology. ''TGD''. Retrieved 12/19/2013 from http://www.tgdaily.com/general-sciences-features/80067-stanford-engineers-build-computer-using-carbon-nanotube-technology. Quoting from the article:


 * A team of Stanford engineers has built a basic computer using carbon nanotubes, a semiconductor material that has the potential to launch a new generation of electronic devices that run faster, while using less energy, than those made from silicon chips. This unprecedented feat culminates years of efforts by scientists around the world to harness this promising material.


 * "People have been talking about a new era of carbon nanotube electronics moving beyond silicon," said Mitra, an electrical engineer and computer scientist, and the Chambers Faculty Scholar of Engineering. "But there have been few demonstrations of complete digital systems using this exciting technology. Here is the proof."


 * Though it could take years to mature, the Stanford approach points toward the possibility of industrial-scale production of carbon nanotube semiconductors, according to Naresh Shanbhag, a professor at the University of Illinois at Urbana-Champaign and director of SONIC, a consortium of next-generation chip design research.

A New Computer Memory Device
ScienceDaily (9/13/2013). The '50-50' Chip: Memory device of the future? Material built from aluminum and antimony shows promise for next-generation data-storage devices. Retrieved 9/22/2013 from http://www.sciencedaily.com/releases/2013/09/130913113340.htm. Quoting from the article:


 * A new, environmentally-friendly electronic alloy consisting of 50 aluminum atoms bound to 50 atoms of antimony may be promising for building next-generation "phase-change" memory devices, which may be the data-storage technology of the future, according to a new paper published in the journal Applied Physics Letters, which is produced by AIP Publishing.


 * Flash memory has problems when devices get smaller than 20 nanometers. But a phase-change memory device can be less than 10 nanometers -- allowing more memory to be squeezed into tinier spaces. "That's the most important feature of this kind of memory," said Xilin Zhou of the Shanghai Institute of Microsystem and Information Technology at the Chinese Academy of Sciences. Data can also be written into phase-change memories very quickly and the devices would be relatively inexpensive, he added.

Human and Computer Brains
Mirani, L. (8/15/2013). Why we’re a long way from computers that really work like the human brain. Quartz. Retrieved 8/30/2013 from http://qz.com/114699/why-were-a-long-way-from-computers-that-really-work-like-the-human-brain/. Quoting from the article:


 * The trouble is that at the moment, no computer is powerful enough to run a program simulating the brain. One reason is the brain’s interconnected nature. In computing terms, the brain’s nerve cells, called neurons, are the processors, while synapses, the junctions where neurons meet and transmit information to each other, are analogous to memory. Our brains contain roughly 100 billion neurons; a powerful commercial chip holds billions of transistors. Yet a typical transistor has just three legs, or connections, while a neuron can have up to 10,000 points of connection, and a brain has some 100 trillion synapses. “There’s no chip technology which can represent this enormous amount of wires,” says Diesmann.


 * Even deciding what counts as “simulating the brain” is tricky. Diesmann’s team ran the K computer, a Japanese supercomputer that is the world’s fourth fastest, and simulated the activity of 1.73 billion nerve cells connected by 10.4 trillion synapses—or about 1% as many as in the human brain. By contrast, IBM last year simulated 530 billion neurons and 100 trillion synapses. But Diesmann’s experiment has been called “the largest general neuronal network simulation to date,” because his team’s version was more sophisticated.

The Future of Moore's Law
Merritt, R. (8/27/2013). Moore's Law dead by 2022, expert says. EE Times. Retrieved 8/28/2013 from http://www.eetimes.com/document.asp?doc_id=1319330. Quoting from the article:


 * "For planning horizons, I pick 2020 as the earliest date we could call it dead," said Robert Colwell, who seeks follow-on technologies as director of the microsystems group at the Defense Advanced Research Projects Agency. "You could talk me into 2022, but whether it will come at 7 or 5nm, it's a big deal," said the engineer who once managed a Pentium-class processor design at Intel.

See also, the article at http://www.computerworld.com/s/article/9244570/Moore_s_Law_isn_t_making_chips_cheaper_anymore.

Comment by Dave Moursund: Of course, this does not mean that we will cease to build faster computers and to develop more powerful applications of our computers. Also, our K-12 educational system is lagging far behind in effective use of our current computer technology. So there is huge room for improvements now and in the next couple of decades.

Language Translation by Computer
van Diggelen, A. (8/20/2013). Silicon Valley seeks to translate the world. The PRI's World. Retrieved 8/26/2013 from http://www.theworld.org/2013/08/machine-language-translation/. Quoting from the document:


 * At the Google headquarters in Silicon Valley, a team works on translation software. Senior Communications Associate Roya Soleimani demonstrates how it works.


 * With a single click, she can access web pages in over 70 languages, and get instant translations on her smartphone of text, speech, and even photos she’s taken of menus and street signs.…


 * Remarkably, there’s not one linguist on Google’s core team: they’re all engineers or statisticians, using computers to analyze billions of translations every day.

Progress in Solid State Drives (SSD)
Shah, A. (8/26/2013). SSDs still maturing, new memory tech still 10 years away. NetworkWorld. Retrieved 8/26/2013 from http://www.networkworld.com/news/2013/082613-ssds-still-maturing-new-memory-273168.html?source=nww_rss. Quoting from the article:


 * SSDs built on flash memory are now considered an alternative to spinning hard-disk drives, which have reached their speed limit. Mobile devices have moved over to flash drives, and a large number of thin and light ultrabooks are switching to SSDs, which are smaller, faster and more power efficient. However, the enterprise market still relies largely on spinning disks, and SSDs are poised to replace hard disks in server infrastructure, experts said. One of the reasons: SSDs are still more expensive than hard drives, though flash price is coming down fast.

Denser, Faster Memory
Simonite, T. (8/14/2013). Denser, faster memory challenges both DRAM and Flash. Technology Review. Retrieved 8/14/2013 from http://www.technologyreview.com/news/517996/denser-faster-memory-challenges-both-dram-and-flash/. Quoting from the article:


 * A new type of memory chip that a startup company has just begun to test could give future smartphones and other computing devices both a speed and storage boost. The technology, known as crossbar memory, can store data about 40 times as densely as the most compact memory available today. It is also faster and more energy efficient.


 * The technology’s ability to store a lot of data in a small space could see it replace the flash memory chips that are the basis of memory cards, some hard drives, and the internal storage of mobile devices. Data can be accessed and written to crossbar memory fast enough to see it also possibly compete with DRAM, used as short-term memory, in computing devices. The technology is significantly more energy efficient than both flash and DRAM.

Simulating a Human Brain
Harris, D. (8/2/2013). Simulating 1 second of real brain activity takes 40 minutes and 83K processors. Gigaom. Retrieved 8/7/2013 from http://gigaom.com/2013/08/02/simulating-1-second-of-real-brain-activity-takes-40-minutes-83k-processors/. Quoting from the article:


 * A team of Japanese and German researchers have carried out the largest-ever simulation of neural activity in the human brain, and the numbers are both amazing and humbling.


 * The hardware necessary to simulate the activity of 1.73 billion nerve cells connected by 10.4 trillion synapses (just 1 percent of a brain’s total neural network) for 1 biological second: 82,944 processors on the K supercomputer and 1 petabyte of memory (24 bytes per synapse). That 1 second of biological time took 40 minutes, on one of the world’s most-powerful systems, to compute.

Quoting project leader Markus Diesmann:


 * “If peta-scale computers like the K computer are capable of representing 1% of the network of a human brain today, then we know that simulating the whole brain at the level of the individual nerve cell and its synapses will be possible with exa-scale computers hopefully available within the next decade.”
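The figures quoted above can be sanity-checked with a few lines of arithmetic. Note that the per-synapse memory figure accounts for only about a quarter of the 1 petabyte quoted; the remainder presumably holds neuron and other network state:

```python
# Scale of the K-computer brain simulation, using the figures quoted
# in the article above.
synapses = 10.4e12          # simulated synapses (~1% of a human brain)
bytes_per_synapse = 24      # per the article
wall_clock_s = 40 * 60      # 40 minutes of computing...
biological_s = 1            # ...for 1 second of brain activity

slowdown = wall_clock_s / biological_s
synapse_state_pb = synapses * bytes_per_synapse / 1e15
print(f"{slowdown:.0f}x slower than real time")
print(f"~{synapse_state_pb:.2f} PB just for synapse state")
```

A 2,400-fold slowdown on 1% of the brain gives a feel for why Diesmann points to exa-scale machines for whole-brain, real-time simulation.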

China Now Has World's Fastest Computer
Thibodeau, P. (June 3, 2013). China surpassing U.S. with 54.9 petaflop supercomputer. Computerworld. Retrieved 6/5/2013 from http://www.computerworld.com/s/article/9239710/China_surpassing_U.S._with_54.9_petaflop_supercomputer. Quoting from the article:


 * China has produced a supercomputer capable of 54.9 petaflops, more than twice the speed of any system in the U.S., according to a U.S. researcher who was in China last week and learned the details.


 * China's latest system was built with Intel chips, but includes indigenously produced Chinese technologies as well. The Chinese government spent about $290 million on it.

For additional information, see http://www.networkworld.com/news/2013/061713-china-trounces-us-in-top500-270883.html.

Comment by Dave Moursund: It takes a lot of money to be at and stay at the forefront of research in the various Science, Technology, Engineering, and Math (STEM) areas. Some of the money goes into hardware, such as fast computers, telescopes, and other hardware in orbit around the earth. Other money goes into the Large Hadron Collider, and so on.

However, well-educated and experienced researchers who are supported to do the research are a key ingredient.

Another key ingredient is worldwide collaboration and sharing. Many of the problems that are currently being worked on in the STEM areas are of worldwide interest and value. It behooves all countries of the world to improve the international research collaboration and sharing of results. A great many countries can make (indeed, are making) significant contributions.

Computer Model of Human Brain
Keats, J. (5/14/2013). Thought experiment: Build a Supercomputer replica of the human brain. Wired. Retrieved 5/20/2013 from http://www.wired.com/wiredscience/2013/05/neurologist-markam-human-brain/. Quoting from the article:


 * [Henry Markram] claims that the only thing preventing scientists from understanding the human brain in its entirety—from the molecular level all the way to the mystery of consciousness—is a lack of ambition. If only neuroscience would follow his lead, he insists, his Human Brain Project could simulate the functions of all 86 billion neurons in the human brain, and the 100 trillion connections that link them. And once that’s done, once you’ve built a plug-and-play brain, anything is possible. You could take it apart to figure out the causes of brain diseases. You could rig it to robotics and develop a whole new range of intelligent technologies. You could strap on a pair of virtual reality glasses and experience a brain other than your own....


 * And now Markram has funding almost as outsized as his ideas. On January 28, 2013, the European Commission—the governing body of the European Union—awarded him 1 billion euros ($1.3 billion). For decades, neuroscientists and computer scientists have debated whether a computer brain could ever be endowed with the intelligence of a human. It’s not a hypothetical debate anymore. Markram is building it. Will he replicate consciousness? The EU has bet $1.3 billion on it.

Comment by Dave Moursund: Progress in artificial intelligence and in major projects such as the one described above present a major challenge to our educational system. What do we want students to learn for their future life in an environment that includes machines that are steadily increasing in cognitive capabilities? In my opinion, we need to be placing increasing emphasis on representing and solving problems in the combined environment of human brains and computer brains working together. See http://iae-pedia.org/Computational_Thinking.

Approaching the End of Moore's Law
Paul, I. (4/4/2013). The end of Moore's Law on the horizon, says AMD. ComputerWorld. Retrieved 4/7/2013 from http://www.computerworld.com/s/article/9238117/The_end_of_Moore_s_Law_on_the_horizon_says_AMD. Quoting from the article:


 * Theoretical physicist Michio Kaku believes Moore's Law has about 10 years of life left before ever-shrinking transistor sizes smack up against limitations imposed by the laws of thermodynamics and quantum physics. …


 * AMD's argument may also reveal a bit of corporate bias related to the company's recent struggles. While AMD's chips are currently stuck at 28nm, Intel is pushing ahead with smaller and smaller designs. Intel currently produces 22nm chips for its latest generation of Core processors, Ivy Bridge. The next generation, Haswell, will also feature a 22nm process. Intel, in 2014, expects to produce 14nm Haswell chips, and the company is aiming to produce 10nm chips by 2016.

Comment by Dave Moursund: From time to time I contemplate what the K-12 educational impact would be if Intel and others reached the stage where it was no longer financially feasible to produce still more densely packed, strongly interconnected sets of transistors. With current capabilities and costs, we could provide every student in America with a networked tablet computer for about two percent of the school budget. The value of this connectivity will grow with the growth in relevant software and better integration of computer capabilities into the curriculum. Does it make a lot of difference whether this can occur for one percent of the school budget rather than for two percent?
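The "two percent" estimate is easy to reproduce with round numbers. Every figure below is an illustrative assumption, not data from the comment:

```python
# Illustrative check of the "two percent of the school budget" estimate.
# All figures are round-number assumptions for the early 2010s, not
# the author's data.
students = 50e6               # approximate U.S. K-12 enrollment
tablet_cost_per_year = 250    # assumed amortized annual cost of a
                              # networked tablet, including connectivity
k12_budget = 600e9            # approximate annual U.S. K-12 spending

share = students * tablet_cost_per_year / k12_budget
print(f"~{share:.1%} of the budget")
```

Under these assumptions the cost lands at roughly two percent, consistent with the comment.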

Robotics
A roadmap for U.S. robotics: From Internet to robotics (2013 ed.). Retrieved 3/26/2013 from http://www.gatech.edu/newsroom/release.html?nid=200741. Quoting from the 127-page report:


 * Last year, robotics celebrated its 50-year anniversary in terms of deployment of the first industrial robot at a manufacturing site. Since then, significant progress has been achieved. Robots are being used across the various domains of manufacturing, services, healthcare/medical, defense, and space. Robotics was initially introduced for dirty, dull, and dangerous tasks. Today, robotics are used in a much wider set of applications, and a key factor is to empower people in their daily lives across work, leisure, and domestic tasks. Three factors drive the adoption of robots: i) improved productivity in the increasingly competitive international environment; ii) improved quality of life in the presence of a significantly aging society; and iii) removing first responders and soldiers from the immediate danger/action. Economic growth, quality of life, and safety of our first responders continue to be key drivers for the adoption of robots.


 * Robotics is one of a few technologies that has the potential to have an impact that is as transformative as the Internet.

Robots and the Workplace
Kang, C. (3/6/2013). New robots in the workplace: Job creators or job terminators? The Washington Post. Retrieved 3/10/2013 from http://www.washingtonpost.com/business/technology/new-robots-in-the-workplace-job-creators-or-job-terminators/2013/03/06/a80b8f34-746c-11e2-8f84-3e4b513b1a13_story.html. Quoting from the article:


 * Today’s robots can do far more than their primitive, single-task ancestors. And there is a broad debate among economists, labor experts and companies over whether the trend will add good-paying jobs to the economy by helping firms run more efficiently or simply leave human workers out in the cold.


 * “We’ve reached a tipping point in robotics,” said Daniela Rus, director of MIT’s Computer Science and Artificial Intelligence Laboratory. The possibility is to run a factory, she added, “all while you are sleeping.”

Comment by Dave Moursund: Here is an example. Quoting again from the article:
 * Already on the market is Baxter, a robot developed by a former director of MIT’s lab. It launched in September and is being used by plastics and metal manufacturing firms. With red plastic arms and a cartoon face, it can do the job of two or more workers, simultaneously unpacking pipe fittings from a conveyer belt while it weighs and places mirrors into boxes. When a human blocks its path, Baxter stops, its eyes widen, and then it courteously gets out of the way. [Bold added for emphasis.]

Increasing Demands for Computer Storage
Jacobi, J.L. (1/29/2013). Wanted: 40 trillion gigabytes of open storage, stat!. PC World. Retrieved 2/1/2013 from http://www.pcworld.com/article/2025423/wanted-40-trillion-gigabytes-of-open-storage-stat-.html. Quoting from the article:


 * Industry research firm IDC predicts a 50-fold increase in the total amount of digitally stored data between 2010 and 2020. This means that in the next seven years, the world's total data footprint will reach 40 zettabytes (that's 40 trillion gigabytes), and every man, woman and child on the planet will account for some 5.2 terabytes of data whether in the cloud or local storage.


 * That's a lot of ones and zeroes, and, unfortunately, help is not on the way in the form of some high-tech, sci-fi breakthrough. To wit: Holographic storage will not bail us out in the next seven years. Nope, between now and 2020, the heavy lifting will be done by that venerable mainstay of storage—the mechanical hard drive. And, yes, USB flash drives, SSDs (solid-state drives), optical drives, and even tape backup systems will also remain in play.
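A quick back-of-the-envelope check of the IDC figures quoted above. This is just a sketch; the variable names and the world-population inference are mine, not the article's:

```python
# Sanity-check IDC's 2020 storage forecast using decimal (SI) units.
zettabyte = 10**21   # bytes
terabyte = 10**12    # bytes

total_2020 = 40 * zettabyte      # forecast: 40 ZB by 2020
total_2010 = total_2020 / 50     # the forecast is a 50-fold increase over 2010
per_person = 5.2 * terabyte      # article's per-capita figure

# Dividing the total by the per-person share implies a world population.
implied_population = total_2020 / per_person

print(f"Implied 2010 baseline: {total_2010 / zettabyte:.1f} ZB")
print(f"Implied population: {implied_population / 1e9:.1f} billion people")
```

The numbers are internally consistent: 40 ZB at 5.2 TB per person implies roughly 7.7 billion people, close to the population projected for 2020.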

Future of Supercomputers
Service, R.F. (1/23/2013). Live chat: The future of supercomputers. American Association for the Advancement of Science. Retrieved 2/1/2013 from http://news.sciencemag.org/sciencenow/2013/01/live-chat-the-future-of-supercom.html. Quoting from the document:


 * Here are just a few quick notes on the units we’re talking about, so we’re all on the same page. The speed of computers is measured in something called floating point operations per second, or FLOPS. The world’s fastest supercomputer today runs at just over 17.6 petaflops, which is 17.6 × 10^15 FLOPS. The next major milestone in supercomputing is an exaflop (10^18 FLOPS), or 1,000 petaflops.


 * Supercomputers enable simulation - that is, the numerical computations to understand and predict the behavior of scientifically or technologically important systems - and therefore accelerate the pace of innovation. Simulation enables better and more rapid product design. Simulation has already allowed Cummins to build better diesel engines faster and less expensively, Goodyear to design safer tires much more quickly, Boeing to build more fuel-efficient aircraft, and Procter & Gamble to create better materials for home products. Simulation also accelerates the progress of technologies from laboratory to application. Better computers allow better simulations and more confident predictions. The best machines today are 10,000 times faster than those of 15 years ago, and the techniques of simulation for science and national security have been improved.


 * Over the past few decades, supercomputers have grown steadily faster and more powerful. Today's top machine is able to crunch 17.6 million billion calculations per second, or 17.6 petaflops. Researchers are now pondering making their next great leap in supercomputing power, up to 1000 petaflops, also known as an exaflop. Exascale computers would be more than 50 times faster than current machines, and thus ideal for running complex simulations such as determining the role of clouds in climate change and modeling new engine designs to burn advanced biofuels.


 * But developing exascale supercomputers is expected to require a revolution in supercomputer design and technology and cost hundreds of millions, if not billions, of dollars. Will it be possible to develop exascale supercomputers? If so, when can we expect to see them? What will they be capable of doing? And which governments will step up to foot the bill?
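The unit relationships in the passages above can be verified with a few lines of arithmetic. A minimal sketch (the variable names are mine):

```python
# FLOPS unit arithmetic from the supercomputing discussion.
petaflop = 10**15   # floating point operations per second
exaflop = 10**18

top_machine = 17.6 * petaflop   # fastest machine at the time (17.6 PF)
exascale = 1 * exaflop          # the next major milestone

print(f"1 exaflop = {exaflop // petaflop} petaflops")
print(f"Exascale speedup over 17.6 PF: {exascale / top_machine:.0f}x")
```

The computed speedup is about 57x, which matches the article's statement that exascale computers "would be more than 50 times faster than current machines."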

One Million Core Supercomputer
Myers, A. (1/25/2013). Stanford researchers break million-core supercomputer barrier. Retrieved 1/30/2013 from http://engineering.stanford.edu/news/stanford-researchers-break-million-core-supercomputer-barrier. Quoting from the article:


 * Stanford Engineering's Center for Turbulence Research (CTR) has set a new record in computational science by successfully using a supercomputer with more than one million computing cores to solve a complex fluid dynamics problem—the prediction of noise generated by a supersonic jet engine.


 * Joseph Nichols, a research associate in the center, worked on the newly installed Sequoia IBM Bluegene/Q system at Lawrence Livermore National Laboratories (LLNL) funded by the Advanced Simulation and Computing (ASC) Program of the National Nuclear Security Administration (NNSA). Sequoia once topped the list of the world's most powerful supercomputers, boasting 1,572,864 compute cores (processors) and 1.6 petabytes of memory connected by a high-speed five-dimensional torus interconnect.

Comment by Dave Moursund: I find this article interesting for two reasons. First, it is indicative of the progress being made in building supercomputers through the use of an increasing number of cores. Since steady progress is being made in developing multi-core processing units, this portends well for the development of still more powerful computers in the future. Second, it provides a good example of a practical application of a supercomputer. Computer modeling and simulation using the most powerful supercomputers is an essential element of problem solving in the STEM areas.

Sending the Watson Computer System to School
Hill, M. (1/30/2013). IBM sends Watson to NY college to boost its skills. Retrieved 1/30/2013 from http://customwire.ap.org/dynamic/stories/U/US_WATSON_TO_COLLEGE?SITE=NDBIS&SECTION=HOME&TEMPLATE=DEFAULT&CTIME=2013-01-30-03-04-36. Quoting from the article:


 * IBM is announcing Wednesday that it will provide a Watson system to Rensselaer Polytechnic Institute, the first time the computer is being sent to a university. Just like the flesh-and-blood students who will work on it, Watson is leaving home to sharpen its skills. Course work will include English and math.
 * IBM, which provided a grant to RPI to operate Watson for three years, sees it as a way to help it boost the computer's cognitive capabilities.

Comment by Dave Moursund: I find this to be a very interesting idea. IBM benefits by having a number of RPI faculty and students contribute their knowledge and skills. RPI benefits by having free access to one of the world's most "intelligent" and powerful computer systems. This is a win-win situation that will likely be repeated in future collaborations between IBM and AI-oriented research universities.

Author
This page was created by and is maintained by David Moursund.