What the Future is Bringing Us (2011)



'''What the Future is Bringing Us entries are grouped by year. Click on the desired year.'''


 * (2017)


 * (2016)


 * (2015)


 * (2014)


 * (2013)


 * (2012)


 * (2011) This is the page you are currently viewing.


 * (2010)


 * (2009)


 * (2008)


 * (2007)


 * (2000 to 2003) Golden Oldie News, October-December 2000 through January-March 2003. These materials were moved from an old Oregon Technology Education Council (OTEC) site developed by David Moursund. Most of the links in the referenced articles no longer work.


 * (1987 Futuristic Math Education Scenarios).


 * (1974 to 2001) All of David Moursund's editorials published in Learning and Leading with Technology from its inception in 1974 until he retired from ISTE in 2001.

This IAE-pedia page is protected against changes by readers.

BigDog is a rough-terrain robot built by Boston Dynamics that walks, runs, climbs, and carries heavy loads. It is powered by an engine that drives a hydraulic actuation system, and its four legs are articulated like an animal’s, with compliant elements to absorb shock and recycle energy from one step to the next. BigDog is about the size of a large dog or small mule: roughly 3 feet long, 2.5 feet tall, and 240 lbs. See more pictures of social robots at http://www.bing.com/images/search?q=pictures+social+robots&qpvt=pictures+social+robots&qpvt=pictures+social+robots&FORM=IQFRML.



See MIT's Electric Cheetah Robot video at http://www.engadget.com/2014/09/15/mit-darpa-cheetah-robot/.



The Cray-2 supercomputer was the world's fastest supercomputer until 1990. But even with a peak performance of 1.9 GFLOPS, the liquid-cooled, 200-kilowatt machine ranks behind a number of "modern" portable, battery-powered smartphones when it comes to GFLOPS ratings.

Year 2011 Table of Contents



 * "All education springs from some image of the future. If the image of the future held by a society is grossly inaccurate, its education system will betray its youth." (Alvin Toffler; American writer and futurist; born October 3, 1928.)


 * "Don't worry about what anybody else is going to do… The best way to predict the future is to invent it. Really smart people with reasonable funding can do just about anything that doesn't violate too many of Newton's Laws!" (Alan Kay; American computer scientist and educator; born May 17, 1940.)

Introduction
All of education is future-oriented. Through informal and formal education, students are being prepared for their futures. Of course, a major goal of education is to preserve and pass on the culture, values, history, and so on from the past. Ideally, this is done in a manner that helps prepare students for their futures as members of local, regional, national, and world societies.

Special Message for Teachers. Consider establishing a "futures" time period each week, in which you engage your students in an exploration of possible futures they will live in and how the subject(s) you are teaching are helping to prepare them for these possible futures. One way to do this is to select a topic from the (growing) list given below. Engage students in a discussion of what they know about the topic. Perhaps point them to some material to read. Engage them in a discussion of how the content you are teaching fits in with preparing them for life in a world in which the forecasts may well come true.

Another approach is to encourage your students to bring in hard copy materials and Web links that contain forecasts of the future. Each week a different small team of students could assume responsibility for leading the weekly "futures" session.

Still another approach is to raise the following question with your students near the beginning of any new unit of study: What changes are going on around the world that are having a major impact on this unit of study? The idea is to emphasize change and that you are helping your students get an education that prepares them for a changing world.

Teachers working with students may also be interested in having the students research and report on a "futures prediction" from five or ten years ago, or perhaps from when they were in first grade, or from the year they were born. Students can find out which predictions have become part of our world today, which have failed, and why in each case.

A Free Book on Future of ICT in Education
The following book is available free on the Web in both PDF and Microsoft Word formats.


 * Moursund, D.G. (2005). Planning, Forecasting, and Inventing Your Computers-in-Education Future. Eugene, OR: Information Age Education. Retrieved 11/20/2010 from http://i-a-e.org/downloads/doc_download/23-planning-forecasting-and-inventing-your-computers-in-education-future.html.

Quoting from the Preface:


 * I strongly believe that our education system can be a lot better than it currently is. Indeed, I predict that during the next two decades we will substantially improve our educational system. In this book, I enlist your help in making this prediction come true.


 * The focus in this book is on two aspects of improving our educational system:
 * Improving the quality of education that K-12 students are receiving.
 * Improving the professional lives of teachers and other educators.
 * This book is mainly designed for preservice and inservice teachers and other educators. If you fall into this category, you will find that this book focuses on your possible futures of Information and Communication Technology (ICT) in education. It will do this by:
 * Helping you make and implement some ICT-related decisions that will likely prove very important to you during your professional career in education.
 * Helping you to increase your productivity and effectiveness as you work to improve the quality of education being received by your students.

Some Forecasts
This section contains relatively recent forecasts of future technology that are important to our current and future educational systems. For the most part, the most recent entries are at the top of this section.

Human-Computer Interfaces
Levinson, Meredith (n.d.). The future of human computer interfaces. CIO. Retrieved 11/9/2011 from http://www.cio.com/article/693187/The_Future_of_Human_Computer_Interfaces. Quoting from the article:


 * Researchers at the MIT Media Lab's Fluid Interfaces Group are prototyping new, novel and more natural ways for people to interact with computers and access and store information. Their innovations have been designed to improve and enrich our personal and professional lives, making it easier to create, communicate, collaborate and even cook. Here are 10 inventions that enhance human-computer interactions, improve the in-store shopping experience, and even help us kick bad habits.

Here are two examples that I found appealing:


 * Seth Hunter designed an interactive table where workers can comfortably sit and take and store notes. The MemTable consists of two projectors, two cameras, two mirrors, and software that supports brainstorming, decision-making, event planning and story boarding, as well as five different types of inputs (text, image capture, sketching, laptop capture and audio).


 * TaPuMa stands for tangible public maps. These are digital, touch-screen maps on which you can place everyday objects (e.g. plane tickets, cash, credit cards, cell phones, chewing gum) and get access to relevant, just-in-time location information. If you were using a TaPuMa at an airport, for example, and you placed your boarding pass on the map, the map would show you how to get to your gate. If you put your credit card on the map, it could tell you the location of an ATM or money exchange.

Ray Kurzweil and the Singularity
Kurzweil, Ray (10/19/2011). Kurzweil responds: Don't underestimate the Singularity. Technology Review. Retrieved 11/7/2011 from http://www.technologyreview.com/blog/guest/27263/?p1=A4.

Ray Kurzweil and Paul Allen are exchanging "verbal blows" about the future. The article cited above is Kurzweil's response to an article by Paul Allen and Mark Greaves. Kurzweil provides a summary of his insights into the future. He argues that we will continue to have more powerful computers and also better algorithms. The following quote from "Report to the President and Congress, Designing a Digital Future" captures these ideas:


 * Even more remarkable—and even less widely understood—is that in many areas, performance gains due to improvements in algorithms have vastly exceeded even the dramatic performance gains due to increased processor speed. The algorithms that we use today for speech recognition, for natural language translation, for chess playing, for logistics planning, have evolved remarkably in the past decade ... Here is just one example, provided by Professor Martin Grötschel of Konrad-Zuse-Zentrum für Informationstechnik Berlin. Grötschel, an expert in optimization, observes that a benchmark production planning model solved using linear programming would have taken 82 years to solve in 1988, using the computers and the linear programming algorithms of the day. Fifteen years later—in 2003—this same model could be solved in roughly one minute, an improvement by a factor of roughly 43 million. Of this, a factor of roughly 1,000 was due to increased processor speed, whereas a factor of roughly 43,000 was due to improvements in algorithms! Grötschel also cites an algorithmic improvement of roughly 30,000 for mixed integer programming between 1991 and 2008. The design and analysis of algorithms, and the study of the inherent computational complexity of problems, are fundamental subfields of computer science.
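The factor arithmetic in Grötschel's example is easy to check. Here is a short Python sketch using only the numbers quoted above:

```python
# Grötschel's benchmark: a linear-programming model that took 82 years
# to solve in 1988 could be solved in roughly one minute by 2003.
hardware_factor = 1_000     # speedup attributed to faster processors
algorithm_factor = 43_000   # speedup attributed to better LP algorithms

total_factor = hardware_factor * algorithm_factor
print(f"{total_factor:,}")  # 43,000,000 -- the "factor of roughly 43 million"

# Cross-check against the quoted solve times: 82 years, expressed in
# minutes, should come out near 43 million (versus 1 minute in 2003).
minutes_1988 = 82 * 365.25 * 24 * 60
print(f"{minutes_1988:,.0f}")  # 43,128,720
```

The two independent routes to "roughly 43 million" agree, which is a good sign that the quoted factors are internally consistent.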

Low-cost Electronic Tablet
Ruth, Boyd, and Meng (10/3/2011). Low-cost electronic tablet proves worth in Indian classroom. Rice University News and Media Relations. Retrieved 10/5/2011 from http://www.media.rice.edu/media/NewsBot.asp?MODE=VIEW&ID=16258&SnID=1142525090. Quoting from the document:


 * The U.S.- and Singapore-based creators of the low-cost I-slate electronic tablet are preparing for full-scale production now that a yearlong series of tests has shown that the device is an effective learning tool for Indian children.


 * The I-slate, an electronic version of the hand-held blackboard slates used by millions of Indian children, will eventually be solar-powered for use in classrooms that lack electricity. It is being developed by researchers at the Institute for Sustainable and Applied Infodynamics (ISAID), a joint program of Rice University in Houston and Nanyang Technological University (NTU) in Singapore. When mass-produced, the solar-powered I-slate is expected to cost less than $50 (64 Singapore dollars).

The document includes a 3-minute video. The new product is a good indicator of things to come. Just as children long ago routinely used a small slate and chalk in place of paper and pencil, we are moving toward all students having a computer equivalent of such an aid to learning. As indicated in the following quoted paragraph, it appears that initial uses will be for computer-assisted learning.


 * In March, the researchers examined whether the I-slate helped students improve in mathematics. Students use a stylus to tap and write out mathematics problems on the I-slate. They get immediate feedback about correct and incorrect answers. When answers are incorrect, the machine gives hints and tips about how to correct mistakes.

That, of course, is a long way from integrating the problem-solving capabilities of computers and connectivity into the everyday curriculum.

Interview with Google's Director of Research
Simonite, Tom (9/22/2011). Searching for new ideas. Technology Review. Retrieved 9/23/2011 from http://www.technologyreview.com/web/38653/. Quoting from the article:


 * If anyone can preview the future of computing, it should be Alfred Spector, Google's director of research. Spector's team focuses on the most challenging areas of computer science research with the intention of shaping Google's future technology. During a break from a National Academy of Engineering meeting on emerging technologies hosted by his company, Spector told Technology Review's computing editor Tom Simonite about these efforts, and explained how Google funnels its users' knowledge into artificial intelligence.


 * TR: Google often releases products based on novel ideas and technologies. How is the research conducted by your team different from the work carried out by other groups?


 * Spector: We also work on things that benefit Google and its users, but we have a longer time horizon and we try to advance the state of the art. That means areas like natural language processing [understanding human language], machine learning, speech recognition, translation, and image recognition. These are mostly problems that have traditionally been called artificial intelligence.


 * We have the significant advantage of being able to work in vivo on the large systems that Google operates, so we have large amounts of data and large numbers of users.

Five Tech Breakthroughs
Nadel, Brian (9/13/2011). Five tech breakthroughs: Chip-level advances that may change computing. ComputerWorld. Retrieved 9/17/2011 from http://www.computerworld.com/s/article/9219723/5_tech_breakthroughs_Chip_level_advances_that_may_change_computing.

Roughly speaking, "experts" can do a relatively good job of predicting what types of ICT products will be available five years in the future. That is because it tends to take about five years to move from "successful research" to a marketable product. Quoting from the article:


 * Imagine a world with electronic devices that can power themselves, music players that hold a lifetime of songs, self-healing batteries, and chips that can change abilities on the fly. Based on what's going on in America's research laboratories, these things are not only possible, but likely.


 * "The next five years will be a very exciting time for electronics," says David Seiler, chief of the semiconductor electronics division at the Department of Commerce's National Institute of Standards and Technology (NIST) in Gaithersburg, Md. "There will be lots of things that today seem like far-out fantasy but will start to be commonplace."


 * In this two-part series, we'll take you on a tour of what could be the future of electronics. Some of these ideas may sound fantastic, others simply a long-overdue dose of common sense, but the common thread is that they have all been demonstrated in the lab and have the potential to become commercially available products in the next five years or so.

In terms of education, the pace of technological change is not slowing down. Instead, technology and its accompanying changes are creating increased pressure for major educational changes.

Moore's Law
Feldman, Michael (6/29/2011). Moore's Law meets exascale computing. HPC Wire. Retrieved 7/6/2011 from http://www.hpcwire.com/hpcwire/2011-06-29/moore_s_law_meets_exascale_computing.html. Quoting from the article:


 * There are no exascale supercomputers yet, but there are plenty of research papers on the subject. The latest is a short but intense white paper centering on some of the specific challenges related to CMOS technology over the next decade and a half. The paper's principal focus is about dealing with the end of Moore's Law, which, according to best predictions, will occur during the decade of exascale computing.
 * And unfortunately there is currently no technology to take the place of CMOS, although a number of candidates are on the table. Spintronics, nanowires, nanotubes, graphene, and other more exotic technologies are all being tested in the research labs, but none are ready to provide a wholesale replacement of CMOS.

Prediction for 2018 Supercomputer
Shah, Agam (6/20/2011). SGI, Intel plan to speed supercomputers 500 times by 2018. Computerworld. Retrieved 6/22/2011 from http://www.computerworld.com/s/article/9217763/SGI_Intel_plan_to_speed_supercomputers_500_times_by_2018. Quoting from the article:


 * Silicon Graphics International hopes by 2018 to build supercomputers 500 times faster than the most powerful today, using specially designed accelerator chips made by Intel, SGI's chief technology officer said.




 * Chips based on the MIC [many integrated cores] architecture mix standard x86 cores with specialized cores to boost high-performance computing. Today's fastest supercomputers top out at around 2.5 petaflops (2.5 thousand trillion calculations per second), but efforts to improve throughput and on-chip performance are under way. IBM, for example, said it will use pulses of light to accelerate data transfers between chips. These and other measures could lead to supercomputers that can deliver performance of over one exaflop, or 1000 petaflops, by 2020.

Comment by Dave Moursund:
 * There are many problem areas that benefit from the steadily increasing power of supercomputers. At the research frontiers, researchers are already primed to make use of this increasing power. Such researchers and their problems are a driving force for still more powerful computers. However, our general educational system at the precollege level and lower division college levels has not yet adjusted to the fact that very fast computers are a powerful aid to problem solving in many of the disciplines that students are required to study or want to study.

Growth in World's Collection of Data
Mearian, Lucas (6/28/2011). World's data will grow by 50X in next decade, IDC study predicts. ComputerWorld. Retrieved 6/29/2011 from http://www.computerworld.com/s/article/9217988/World_s_data_will_grow_by_50X_in_next_decade_IDC_study_predicts. Quoting from the article:


 * In 2011 alone, 1.8 zettabytes (or 1.8 trillion gigabytes) of data will be created, the equivalent to every U.S. citizen writing 3 tweets per minute for 26,976 years. And over the next decade, the number of servers managing the world's data stores will grow by ten times.


 * The IDC study predicts that overall data will grow by 50 times by 2020, driven in large part by more embedded systems such as sensors in clothing, medical devices and structures like buildings and bridges.


 * The study also determined that unstructured information - such as files, email and video - will account for 90% of all data created over the next decade.
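The tweet equivalence in the first quote can be roughly verified. In the Python sketch below, the per-tweet size (~140 bytes) and U.S. population (~310 million) are assumptions on my part, not figures from the article:

```python
# Plausibility check of "every U.S. citizen writing 3 tweets per minute
# for 26,976 years" as an equivalent of 1.8 zettabytes.
BYTES_PER_TWEET = 140   # assumed: a tweet is at most 140 characters
US_CITIZENS = 310e6     # assumed: ~310 million people

tweets_per_year = 3 * 60 * 24 * 365   # 3 tweets per minute, all year
total_bytes = tweets_per_year * 26_976 * US_CITIZENS * BYTES_PER_TWEET

ZETTABYTE = 1e21
print(round(total_bytes / ZETTABYTE, 2))  # ~1.85, close to the 1.8 ZB figure
```

Under these assumptions the equivalence holds up well.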

Data Storage
Fox, Tiffany (6/1/2011). New directions in data storage solutions. UC San Diego News Center. Retrieved 6/7/2011 from http://ucsdnews.ucsd.edu/newsrel/general/06-01-11cmrr.asp. Quoting from the article:


 * One day in the not-too-distant future, the entire contents of the Library of Congress might be stored on a device the size of a postage stamp. …


 * It’s an evolution that has taken place with remarkable speed, with the density of hard drives increasing by a factor of about 5,000 in only the past 15 years -- from 100 megabits (100 million bits) per square inch to about 500 gigabits (500 billion bits) per square inch. Meanwhile, the cost per bit has dropped by a factor of 5,000, from $5 per megabyte to less than one-tenth of a cent per megabyte.


 * Siegel puts it another way: “IBM’s first disk drive, built in 1956, was the size of two refrigerators, cost more than $50,000 and held a total of 5 megabytes,” which is only enough storage capacity to handle about 30 seconds of broadcast-quality video. “Now you can buy a three-terabyte hard drive for $130 on Amazon.com” and have enough capacity to store the equivalent of an entire academic research library. …


 * He says that it’s not clear that either approach will work, but overcoming these initial hurdles in recording data at the nanoscale is a necessary step if computer engineers are to achieve the final frontier: Storing data near the atomic or molecular level. According to IBM, such a storage capability would enable nearly 30,000 feature length movies or the entire contents of YouTube – millions of videos estimated to be more than 1,000 trillion bits of data – to fit in a device the size of an iPod.
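Siegel's 1956-versus-now comparison can be restated as cost per megabyte, using only the figures quoted above:

```python
# 1956: IBM's first disk drive -- over $50,000 for 5 megabytes.
cost_1956_per_mb = 50_000 / 5        # $10,000 per MB

# 2011: a 3 TB drive for $130 (3 TB taken as 3 million MB here).
cost_2011_per_mb = 130 / 3_000_000   # about $0.000043 per MB

drop = cost_1956_per_mb / cost_2011_per_mb
print(f"{drop:,.0f}")  # roughly a 230-million-fold drop in cost per MB
```

That is a far steeper decline than the 5,000-fold drop cited for the past 15 years alone, which is what 55 years of compounding improvement looks like.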

Faster WiFi
Shankland, Stephen (5/31/2011). Coming to a network near you: Faster WiFi. CNET News. Retrieved 6/3/2011 from http://news.cnet.com/8301-30685_3-20067363-264.html. Quoting from the site:


 * Wi-Fi, the marketing-friendly term for the 802.11 family of wireless networking standards, got its mainstream start with 802.11b with a data-transfer speed of 11 megabits per second. Next came 802.11g at 54Mbps, then the present fastest standard, 802.11n with a top speed of 450Mbps.


 * But under development now are two new versions: 802.11ac at 1 gigabit per second and 802.11ad at 7 Gbps. Those speeds are good enough to open up a major new market, wireless streaming video, likely in 2012 or 2013. …


 * …Thus, it looks like 802.11ac is destined to be the successor to mainstream networking access point technology that people use in homes, businesses, and public Wi-Fi hot spots, said Kelly Davis-Felner, marketing director for the Wi-Fi Alliance. (She uses a different label for 802.11ac, VHT, short for very high throughput.)


 * "It's a big enabler for the digital home," she said. "With 1Gbps for the raw data rate, the benchmark I hear is you can comfortably stream three lightly compressed HD videos at a time."

Educators should note that the faster of these two new standards would make a large difference within school buildings.
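To see why the jump from 450 Mbps to 1-7 Gbps matters in a building full of simultaneous users, here is a back-of-envelope Python sketch. The ~8 Mbps per compressed HD stream and the 50% usable-throughput fraction are assumptions on my part, not numbers from the article:

```python
# Rough estimate of concurrent HD video streams per Wi-Fi generation.
raw_rates_mbps = {"802.11n": 450, "802.11ac": 1_000, "802.11ad": 7_000}
STREAM_MBPS = 8        # assumed bit rate of one compressed HD stream
USABLE_FRACTION = 0.5  # assumed usable share of the raw link rate

for standard, raw in raw_rates_mbps.items():
    streams = int(raw * USABLE_FRACTION / STREAM_MBPS)
    print(f"{standard}: about {streams} simultaneous streams")
```

Even under these conservative assumptions, a classroom access point moves from serving a couple of dozen video streams to serving hundreds.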

Phase Change Memory
UCSD (6/2/2011). Phase change memory-based 'Moneta' system points to the future of computer storage. Retrieved 6/3/2011 from http://www.jacobsschool.ucsd.edu/news/news_releases/release.sfe?id=1078. Quoting from the site:


 * A University of California, San Diego faculty-student team is about to demonstrate a first-of-its kind, phase-change memory solid state storage device that provides performance thousands of times faster than a conventional hard drive and up to seven times faster than current state-of-the-art solid-state drives (SSDs). …


 * Swanson hopes to build the second generation of the Moneta storage device in the next six to nine months and says the technology could be ready for market in just a few years as the underlying phase-change memory technology improves. The development has also revealed a new technology challenge.


 * “We’ve found that you can build a much faster storage device, but in order to really make use of it, you have to change the software that manages it as well. Storage systems have evolved over the last 40 years to cater to disks, and disks are very, very slow,” said Swanson. “Designing storage systems that can fully leverage technologies like PCM requires rethinking almost every aspect of how a computer system’s software manages and accesses storage. Moneta gives us a window into the future of what computer storage systems are going to look like, and gives us the opportunity now to rethink how we design computer systems in response.”

Medicine and Technology
Weinstein, Sheryl (5/26/2011). Five new hot spots where medicine and technology will converge. New Jersey Institute of Technology. Retrieved 6/1/2011 from http://www.njit.edu/news/2011/2011-210.php. Five areas are listed and briefly described. Quoting from the article, here is one of the areas:


 * “Point of care health care technologies is the way medicine can be delivered in individual situations ranging from health monitoring to telemedicine. All point of health care solutions depend on patients connecting with healthcare professionals via computers. Treating people this way can be beneficial both as a great cost savings but also from a quality standpoint. Within this mindset, nursing engineering is fast becoming a career of the future. So too are health monitoring, e-health, health care information management for disaster situations and more. In this world of point of care technologies, the US will need to find a way to link to better efforts in Europe and the Far East. All these solutions will also depend on computer hardware and software improvements.”

Making Computer Chips More Vertical
Markoff, John (5/4/2011). Intel increases transistor speed by building upward. The New York Times. Retrieved 5/6/2011 from http://www.nytimes.com/2011/05/05/science/05chip.html?_r=2. Quoting from the article:


 * The company has already begun making its microprocessors using a new 3-D transistor design, called a Finfet (for fin field-effect transistor), which is based around a remarkably small pillar, or fin, of silicon that rises above the surface of the chip. Intel, based in Santa Clara, Calif., plans to enter general production based on the new technology some time later this year.


 * Although the company did not give technical details about its new process in its Wednesday announcement, it said that it expected to be able to make chips that run as much as 37 percent faster in low-voltage applications and it would be able to cut power consumption as much as 50 percent.


 * The timing of the announcement Wednesday is significant, Dr. Bohr said, because it is evidence that the world’s largest chip maker is not slipping from the pace of doubling the number of transistors that can be etched onto a sliver of silicon every two years, a phenomenon known as Moore’s Law. Although not a law of physics, the 1965 observation by Intel’s co-founder, Gordon Moore, has defined the speed of innovation for much of the world’s economy. It has also set the computing industry apart from other types of manufacturing because it has continued to improve at an accelerating rate, offering greater computing power and lower cost at regular intervals.

How Smart Is IBM's Watson?
Mearian, Lucas (4/12/2011). IBM's Watson not as smart as you think. But increasing compute power will mean 'smart' products for everyone, MIT prof says. Quoting from the article:


 * "Although Watson is a tremendous engineering achievement, there are some things it can't do," said Patrick Henry Winston, a professor and former director of the Massachusetts Institute of Technology's (MIT) Artificial Intelligence Laboratory. "For example, if there was a conference about Watson, Watson couldn't attend. It would have nothing to say about itself. It can't participate in discussions about how it works."...


 * Winston pointed out that after computer scientists, such as James Slagle, began producing A.I. programs in the early 1960s, the scientific community and the public believed computers would have general intelligence within a few years. That didn't happen.


 * "Apparently what we forgot or overlooked is the idea that it's much harder to produce programs that have common sense than it is to produce programs that behave at expert levels in very narrow technical domains," he said.

The last part of the above quote expresses a key idea in a good education. It is much harder, and much more valuable, to teach for common sense than it is to teach humans to do well in areas where computers already perform very, very well.

Intelligent Microscopy
European Molecular Biology Laboratory Press Release 1/23/2011. Retrieved from http://www.embl-hamburg.de/aboutus/communication_outreach/media_relations/2011/110123_Heidelberg/index.html. Quoting from the press release:


 * The sight of a researcher sitting at a microscope for hours, painstakingly searching for the right cells, may soon be a thing of the past, thanks to new software created by scientists at the European Molecular Biology Laboratory (EMBL) in Heidelberg, Germany. Presented today in Nature Methods, the novel computer programme can rapidly learn what the scientist is looking for and then takes over this laborious and time-consuming task, automatically performing complex microscopy experiments when it detects cells with interesting features.


 * Called Micropilot, the software brings machine learning to microscopy. It analyses low-resolution images taken by a microscope and, once it has identified a cell or structure the scientists are interested in, it automatically instructs the microscope to start the experiment. This can be as simple as recording high-resolution time-lapse videos or as complex as using lasers to interfere with fluorescently tagged proteins and recording the results.

Comment by Dave Moursund: This article illustrates a very important idea. Repetitious tasks—even if they involve some thinking skills—lend themselves to being automated. The article gives an example of the automated microscope doing in four nights of unattended operation what would have taken a human a month to accomplish. The article does not provide details of the cost of an automated version of the microscope—but we can surmise that the automated machine is cost effective relative to an experienced (human) microscopist.

A Marriage of Computer Memory and Processors
Markoff, John (2/28/2011). Remapping computer circuitry to avert impending bottlenecks. The New York Times. Retrieved 3/2/2011 from http://www.nytimes.com/2011/03/01/science/01compute.html?_r=2. Quoting from the article:


 * PALO ALTO, Calif. — Hewlett-Packard researchers have proposed a fundamental rethinking of the modern computer for the coming era of nanoelectronics — a marriage of memory and computing power that could drastically limit the energy used by computers.




 * To distinguish the new type of computing from today’s designs, he said that systems will be based on memory chips he calls “nanostores” as distinct from today’s microprocessors. They will be hybrids, three-dimensional systems in which lower-level circuits will be based on a nanoelectronic technology called the memristor, which Hewlett-Packard is developing to store data. The nanostore chips will have a multistory design, and computing circuits made with conventional silicon will sit directly on top of the memory to process the data, with minimal energy costs.


 * Within seven years or so, experts estimate that one such chip might store a trillion bytes of memory (about 220 high-definition digital movies) in addition to containing 128 processors, Dr. Ranganathan wrote. If these devices become ubiquitous, it would radically reduce the amount of information that would need to be shuttled back and forth in future data processing schemes.
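As a quick check on the "about 220 high-definition digital movies" figure quoted above, assuming roughly 4.5 GB per HD movie (my assumption, not the article's):

```python
# Sanity check: how many ~4.5 GB HD movies fit in a trillion bytes?
chip_bytes = 10**12        # "one such chip might store a trillion bytes"
movie_bytes = 4.5 * 10**9  # assumed size of one HD movie

print(round(chip_bytes / movie_bytes))  # 222 -- close to the quoted 220
```

So the article's equivalence corresponds to an HD movie of roughly four and a half gigabytes.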

Comment by Dave Moursund: This and many other future-oriented ICT articles point to continuing success in increasing computer speed and storage capacity. I think of this in terms of Two Brains (Human and Computer) are Better than One and the computer brain steadily improving in capabilities.

Our educational system focuses on improving a student's brain through better education. This is a worthy goal—but it is a losing battle. What we need is an educational system that prepares a person to function well in a rapidly changing world in which rapidly increasing computer capabilities are both a source of much of the change and also an aid to dealing with that change.

More About IBM's Watson
McDougall, Paul (2/24/2011). IBM's Watson boosts global supercomputing effort. InformationWeek. Retrieved 2/28/2011 from http://www.informationweek.com/news/hardware/supercomputers/showArticle.jhtml?articleID=229219358. Quoting from the article:


 * Watson, the Jeopardy-playing supercomputer built by IBM, has inspired tech fans around the world to donate their excess compute cycles to a program that loops the power of individual home PCs into a virtual mainframe.


 * Since Watson bested Jeopardy champs Ken Jennings and Brad Rutter last week, the number of individuals who contribute compute power to the World Community Grid project has increased 700%, according to IBM.


 * "Watson's performance on Jeopardy has captured the imagination of millions of viewers who understand the power of computing to benefit humanity," said Stanley Litow, IBM's VP for Corporate Citizenship and Corporate Affairs.

Comment by Dave Moursund: A computer's "brain" and a human brain are quite different. We are living at a time during which computer brains are becoming more and more capable. From time to time we see examples of this increasing computer brain capability.

Watson's performance in the Jeopardy game illustrates progress in human–machine interaction. We have increasing understanding of AI, increasing compute power, increasing large computerized collections of information, and so on. Eventually it will become routine for ordinary people to draw on the capabilities of such computer systems. Movement in this direction will continue to change the job market. Many people will lose their jobs to machines and to the people who use computers to accomplish information processing tasks that now require paid workers.

This evolving situation certainly should give pause to those who design and implement our educational system. For more about this topic, see:


 * IAE Newsletter - Issue 60, February 2011. Assessing Education in an Increasingly Complex, Information-Overloaded World.


 * IAE Blog 2/28/2011. Developing suitable levels of expertise in multiple areas.

Measuring Your Emotions
Naone, Erica (2/24/2011). Computers get in touch with your emotions. Technology Review. Retrieved 2/25/2011 from http://www.technologyreview.com/computing/32429/?a=f. Quoting from the article:


 * Computers could be a lot more useful if they paid attention to how you felt. With the emergence of new tools that can measure a person's biological state, computer interfaces are starting to do exactly that: take users' feelings into account. So claim several speakers at Blur, a conference this week in Orlando, Florida, that focused on human-computer interaction.




 * Hans Lee, chief technical officer of EmSense, a San Francisco company that measures users' cognitive and emotional state for the purpose of market research, says there are plenty of potential applications for a computer that can read a human's mood. "No matter what you do, emotion matters," Lee says.


 * Lee says studies suggest that 40 percent of people verbally abuse their computers. A device capable of recognizing a user's frustration and addressing it could make workers more efficient, and mean fewer broken monitors. "What if your computer could apologize to you?" Lee says.

A Faster Supercomputer to be Ready in 2012
Kenyon, Henry (2/11/2011). [Department of] Energy aims to retake supercomputing lead from China. GCN. Retrieved 2/24/2011 from http://gcn.com/articles/2011/02/11/energy-supercomputer-to-break-performance-records.aspx. Quoting from the document:


 * China currently holds the lead position for the world’s fastest supercomputer, but not for long. The U.S. is working on a new class of computers that will greatly outperform all of the planet’s current supercomputers. These machines themselves will pave the way for even faster computers scheduled to appear by the end of the decade.


 * Commissioned by the Energy Department’s Argonne National Laboratory, the computer will be able to execute 10 quadrillion calculations per second, or 10 petaflops. Nicknamed Mira, the machine will be built by IBM and based on the upcoming version of the firm’s Blue Gene supercomputer architecture, called Blue Gene/Q, Computerworld reported. The supercomputer will be operational in 2012.


 * According to Computerworld, the 10-petaflop performance will be vastly higher than today’s most powerful machine, the Tianjin National Supercomputer Center’s Tianhe-1A system, which has a peak performance of 2.67 petaflops.
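The gap between the two peak-performance figures quoted above can be expressed as a simple ratio; this is a back-of-the-envelope comparison using only the numbers from the article:

```python
# Rough comparison of the two peak-performance figures in the article.
mira_pflops = 10.0     # planned peak of the Mira system
tianhe_pflops = 2.67   # quoted peak of the Tianhe-1A system

speedup = mira_pflops / tianhe_pflops
print(f"Mira's planned peak is about {speedup:.1f}x Tianhe-1A's")  # about 3.7x
```

So "vastly higher" here means roughly a factor of 3.7 in peak performance, comparing planned capability against an already-operational machine.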

Some Intel Projects
Shah, Agam (2/21/2011). Intel aims to reshape chips for next-gen mobile devices. ComputerWorld. Retrieved 2/23/2011 from http://www.computerworld.com/s/article/9210538/Intel_aims_to_reshape_chips_for_next_gen_mobile_devices. Quoting from the article:


 * Looking into its crystal ball at where trends are leading, Intel hopes to bolster on-chip capabilities to vastly improve the security and functionality of mobile devices like smartphones and tablets.


 * The company is looking to implement specialized graphics accelerators and hardware layers to secure mobile devices, company executives said. Intel is also laying plans to integrate sensors and accelerators to measure temperature or quality of air, or check speed, distance traveled and location.

A Video from Corning
Corning (2011). A day made of glass... Made possible by Corning. Retrieved 2/20/2011 from http://www.youtube.com/watch?v=6Cf7IL_eZ38. This is a 5:33 video of the future of glass in your life as seen by Corning. It shows many "cool" ideas.

Computer Versus Humans in Game of Jeopardy
Sutter, John D. (2/7/2011). Behind-the-scenes with IBM's 'Jeopardy!' computer, Watson. CNN. Retrieved 2/8/2011 from http://www.cnn.com/2011/TECH/innovation/02/07/watson.ibm.jeopardy/index.html?hpt=C2. Quoting from the article:


 * His name is Watson. He's bad with puns. Great at math. And, next week, he will compete on the game show "Jeopardy!" against real, live, breathing, thinking humans. …


 * CNN: In learning about Watson, what surprised you the most?


 * Baker: I guess what surprised me the most was Watson's speed, because, when it started out, Watson took two hours to answer a "Jeopardy!" question, often incorrectly, and by the end they got it down to two to three seconds. You have to see this thing.


 * Do you see this as a machines versus humans moment?


 * That's kind of a contrivance because it makes for good TV. And it shows how far the machine has come, which is really impressive. But in the workplace, there's no sense in letting a machine work so hard to try to understand puns and things like that. People can do that. And they can benefit from what the machine does best, which is find stuff in massive troves of data -- find answers.


 * Some people might look at this and be fearful -- as they see a computer that's getting smarter and smarter. Is there any part of you that's afraid of Watson?


 * I think there's a concern for jobs that this type of computer is going to take away, because if you have a job that's based on understanding English and then searching for answers, if you work at a telephone help line for example, this type of machine is going to elbow you out pretty soon.
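The speedup Baker describes in the interview above is easy to quantify; taking the midpoint of "two to three seconds" as the final answer time is an assumption made here for illustration:

```python
# Speedup implied by the interview: Watson went from two hours per
# question to two-to-three seconds per question.
slow_seconds = 2 * 60 * 60    # two hours, in seconds
fast_seconds = (2 + 3) / 2    # assumed midpoint of "two to three seconds"

speedup = slow_seconds / fast_seconds
print(f"Roughly {speedup:.0f}x faster")  # prints roughly 2880x faster
```

An improvement of about three orders of magnitude over the course of the project, achieved through a combination of better algorithms and more parallel hardware.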

Comment by Dave Moursund: In my opinion, win, lose, or draw, this is a defining moment in certain aspects of artificial intelligence and education. Many people believe that the ability to do well in the game of Jeopardy is a sign of intelligence. The hardware and software designed by IBM for use in this game-playing activity provides an indication of current capabilities and limitations of computer systems in an area that most of us believe that we "sort of" understand. We can watch the Jeopardy game on TV and mentally compete against the contestants. We can feel good about ourselves when we do well. How will we feel when the "Watson" computer performs better than we are able to do?

School Days in the Future
Barseghian, Tina (1/17/2011). Future school days encourage exploration. MindShift. Retrieved 1/18/2011 from http://mindshift.kqed.org/2011/01/future-school-day-encourages-exploration/. Quoting from the article:


 * A vision of the school day of the future from Curtis Wong, principal researcher at Microsoft focusing on interaction, media, and visualization technologies. Wong has authored more than 45 patents pending in areas such as interactive television, media browsing, visualization, search, gaming and learning. …


 * Whenever I think about what a school of the future would be like, I remember the first time I visited the Vivarium Project Open School in Los Angeles over 20 years ago. It was conceived by Alan Kay and was exploring some new ideas around the classroom, the role of teachers and the potential impact of networked computers among other ideas in the ecosystem of learning.

Vision Technology
Lohr, Steve (1/1/2011). Computers that see you and keep watch over you. The New York Times. Retrieved 1/3/2011 from http://www.nytimes.com/2011/01/02/science/02see.html?_r=1. Here are three quotes from the article:


 * High-resolution, low-cost cameras are proliferating, found in products like smartphones and laptop computers. The cost of storing images is dropping, and new software algorithms for mining, matching and scrutinizing the flood of visual data are progressing swiftly.


 * … computer vision is moving into the mainstream. With this technological evolution, scientists predict, people will increasingly be surrounded by machines that can not only see but also reason about what they are seeing, in their own limited way.


 * Millions of people now use products that show the progress that has been made in computer vision. In the last two years, the major online photo-sharing services — Picasa by Google, Windows Live Photo Gallery by Microsoft, Flickr by Yahoo and iPhoto by Apple — have all started using face recognition. A user puts a name to a face, and the service finds matches in other photographs. It is a popular tool for finding and organizing pictures.

Links to Other IAE Resources
This is a collection of IAE publications related to the IAE document you are currently reading. For the most part it provides links to general categories of material available on the IAE sites.

IAE Blog
All IAE Blog Entries.

Popular IAE Blog Entries.

Possible futures of Science, Technology, Engineering, and Math (STEM) education.

Forecasting the future: good education helps prepare students for their possible futures.

IAE Newsletter
All IAE Newsletters.

IAE-pedia (IAE's Wiki)
Home Page of the IAE Wiki.

Popular IAE Wiki Pages.

I-A-E Books and Miscellaneous Other
David Moursund's Free Books.

David Moursund's Learning and Leading with Technology Editorials.

Author
This page was created by and is maintained by David Moursund.