On May 2, 2012, some would argue, a new age in higher education officially dawned. Harvard University and the Massachusetts Institute of Technology announced they would each put $30 million into a new partnership, EdX, offering free online courses by some of their leading professors. More crucially, they would provide at low cost, to anyone who demonstrated a mastery of the course material, certificates of completion bearing one or another of the universities’ names.
The willingness of Harvard and MIT to put their lustrous imprimaturs on MOOCs — massive open online courses — suddenly gave these certificates inestimable value and raised the stakes for all institutions offering education over the Internet. The move validated an earlier decision by Stanford to certify courses offered by the start-up Udacity, founded by computer science Prof. Sebastian Thrun, and prompted the top tier of American universities to put their faculty stars on computer screens around the world. The University of California, Berkeley made EdX a threesome. Online course offerings by Coursera, a private company founded by Stanford computer scientists Daphne Koller and Andrew Ng, grew to more than 200 from 33 universities, including Stanford, Princeton, Columbia, Brown, the Universities of Pennsylvania and Michigan, and Ohio State. Schools that held back felt the heat; the University of Virginia’s initial reluctance fueled its Board of Visitors’ short-lived attempt to fire President Teresa Sullivan.
Few contend that the prestigious new course certificates will diminish competition for a four-year degree from an Ivy League or comparable university, even one with an annual sticker price of $40,000 and up. Indeed, these schools’ added international exposure could increase demand, some say. Many educators stress that no MOOC, no matter how interactive the program or how active the online community surrounding the online course, can replace the benefits of in-person, on-campus education. Face-to-face teaching has “a vibrancy and a value and intense mentoring that these huge courses cannot provide,” says Ben Shneiderman, founding director of the University of Maryland’s Human-Computer Interaction Lab. “Those who proclaim the death of the university only reveal how shallow is their understanding of the educational process.”
But what the trend will mean for schools below the topmost rank — including many state and land-grant universities that educate a large proportion of American engineers — is far less obvious. Some observers see a threat to the cash flow of institutions that don’t get in on the trend quickly or don’t rank at the pinnacle of the academic world. Schools of lesser standing may well find themselves futilely competing with much lower-cost, MOOC-based credentials for increasingly cost-conscious students, undercutting the tuition-based system that has supported higher education for centuries.
Autar Kaw, a mechanical engineering professor at the University of South Florida and a leader in developing online engineering curricula, predicts that “MOOCs will not affect the enrollment and popularity of top schools, as students would still like to get branded . . . but it surely challenges mid-tier schools,” and especially the for-profit sector. “Instructors and administrators at these schools will have to take their game up a notch to make the individual student’s experience worth it to pay thousands of dollars in tuition, room, and board.” Kaw is the author of several courses offered by the nonprofit Saylor Foundation, which, like Udacity, contracts individually with the instructors who present their courses.
A Moody’s Investors Service report cited by The Chronicle of Higher Education also says regional universities that chiefly attract students from surrounding areas could lose market share to stronger universities over the long term as a result of MOOCs. It predicts MOOCs will most hurt the bottom line of low-cost local colleges, primarily commuter campuses and for-profit colleges.
An accumulation of certificates from prestigious schools might serve job applicants better than a degree, Coursera’s Andrew Ng suggests. “If you graduate from a lesser engineering department and you send your résumé to Google,” he says, “it’s difficult to get your résumé noticed. If that student comes and takes a Penn computer course and does well, and takes a Stanford engineering course and does well, and takes a Princeton course and does well, that’s a real way for them to distinguish themselves.” People have already gotten jobs based on Udacity certificates, Thrun says. “We found some really amazing people…. There’s a lot of talent that doesn’t go to Stanford and MIT, talent in the developing world.” MOOCs, he believes, offer able individuals everywhere a revolutionary opportunity for top-flight education.
While the completion certificates won’t count as course credit toward a degree at the top schools, many observers suspect that at least some will end up being transformed into credit toward legitimate degrees at other, less prestigious institutions. Saylor, for example, has already devised, in conjunction with a company called StraighterLine, a system that lets students cheaply turn passing grades on exams in Saylor’s MOOCs into degree credit at a number of accredited, though non-elite, American colleges.
Giving away content free or cheaply on the Internet has already weakened the financial foundations of newspapers and magazines, postal mail, retail sales, bookstores, and movie distribution, and is now threatening book publishing. The university appears to be the next major institution that the Internet will transform. Institutions thus need to adopt MOOCs “with their eyes open,” warns Berkeley engineering dean Shankar Sastry. “If a university decided to put its entire curriculum for a bachelor’s online in a nondiscriminatory fashion, I think that they could put themselves in the situation that a lot of newspapers did by handing out everything free… If you do go to the university for the credentialing [or even] for the ‘ecosystem,’ is it worth whatever X thousand dollars it costs to do that?” The revolution currently involves instructional models, Sastry says, but it “could eventually be in business models.”
Thrun argues that this disruption need not be harmful. MOOCs provide universities “a mechanism by which they can reduce their costs and reach more students. How can that be bad?” he asks. “If distribution becomes so much more inexpensive, we should all celebrate this.”
No one has demonstrated a self-sustaining business model for MOOCs. Those operating outside universities not only are tapping highly paid instructors but also, Kaw notes, are employing “artists to develop presentation materials and software and learning science experts to develop state-of-the-art assessment techniques.” Both Ng and Thrun, whose enterprises are currently buoyed by venture capital, foresee the possibility of their companies acting as employment services, collecting fees from employers in exchange for introductions to high-achieving students who have expressed an interest in being connected with firms seeking their skills. As Ng explains: “If [students] ask us to introduce them to recruiters from Google, Facebook, or other companies, that would be great for the student and great for the companies.” Other funding possibilities, Kaw says, include support from foundations, selling course materials to both online and brick-and-mortar universities, online advertising sales using such vehicles as Google AdSense, and partnering with testing agencies such as Pearson to charge money for official certification.
While Coursera, Udacity, and EdX open up renowned institutions to the multitudes, an earlier entry in online education approaches elite schools from the opposite direction. That company, 2tor, has partnered with the University of Southern California, Georgetown, the University of North Carolina at Chapel Hill, and Washington University in St. Louis to offer online graduate-degree programs for which students pay full tuition. Lately, it has sought to expand its offerings by courting engineering schools. Duke’s faculty turned it down, but negotiations are under way with Penn’s engineering school.
Engineering education provided a launch platform for MOOCs when Thrun’s fall 2011 artificial intelligence course at Stanford drew 160,000 online students from 190 countries. Ng says “engineering concepts”—and especially computer engineering — “lend themselves very well” to online instruction and machine grading. “In engineering, a lot of answers are either right or wrong, so for a lot of what we do [in MOOCs] we can test for the correctness of answers… For engineering courses, you can do sophisticated auto-grading. In my machine learning [MOOC], students are asked to write computer programs [and] implement machine-learning algorithms. Their software is then automatically tested to make sure that it generates the right output.”
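The output-checking Ng describes can be pictured with a small sketch. This is an illustrative autograder, not Coursera’s actual system: it runs a hypothetical student submission against reference test cases and compares the outputs numerically, which is exactly the kind of right-or-wrong checking that makes engineering assignments machine-gradable.

```python
# Minimal sketch of an output-checking autograder of the kind Ng
# describes. The student submits a function (here, a toy linear
# model for a machine-learning exercise); the grader runs it
# against reference cases and compares outputs within a tolerance.
# All names and test cases are illustrative, not Coursera's API.

def student_predict(xs, w, b):
    """A hypothetical student submission: linear model y = w*x + b."""
    return [w * x + b for x in xs]

def autograde(submission, test_cases, tol=1e-6):
    """Run the submission on each case and check outputs numerically."""
    passed = 0
    for args, expected in test_cases:
        try:
            result = submission(*args)
        except Exception:
            continue  # a crash counts as a failed case
        if len(result) == len(expected) and all(
            abs(r - e) <= tol for r, e in zip(result, expected)
        ):
            passed += 1
    return passed, len(test_cases)

cases = [
    (([0.0, 1.0, 2.0], 2.0, 1.0), [1.0, 3.0, 5.0]),
    (([-1.0, 4.0], 0.5, 0.0), [-0.5, 2.0]),
]
print(autograde(student_predict, cases))  # (2, 2): both cases pass
```

Because only the outputs are compared, any correct implementation passes, regardless of how the student wrote it.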
Udacity and Coursera have been slow to provide a full spectrum of engineering disciplines, an online field where Lehigh, Drexel, and the Rochester Institute of Technology have a head start. But Coursera now has a growing catalogue of computer science and engineering courses, including Fundamentals of Electrical Engineering from Rice University. The website says the Rice course “can be summarized as ‘the hardest course I have ever taken, but I learned a lot.’” Ng suggests civil engineering would work with Coursera’s assessment software as well. “Imagine a student is asked to write a formula for stresses on a bridge,” he says. “We could test that automatically.” The Saylor Foundation offers 11 courses in mechanical engineering, along with associated math and science courses, and is preparing to add more.
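Ng’s bridge example can be tested without parsing the student’s algebra at all: evaluate the submitted formula and the reference formula at many sample inputs and check that they agree. The sketch below uses the standard flexure formula σ = Mc/I as the reference; the grading scheme and function names are illustrative assumptions, not Coursera’s implementation.

```python
# Sketch of numeric formula-checking for an engineering exercise.
# Instead of symbolic comparison, both formulas are evaluated at
# random sample points; algebraically equivalent answers pass.
# The flexure formula sigma = M*c/I is a textbook reference; the
# grader itself is a hypothetical illustration.
import random

def reference_stress(M, c, I):
    return M * c / I          # bending stress: moment * distance / inertia

def student_stress(M, c, I):
    return (M / I) * c        # an algebraically equivalent submission

def formulas_agree(f, g, trials=100, tol=1e-9):
    """Compare two formulas at random physical inputs within tolerance."""
    random.seed(0)
    for _ in range(trials):
        M = random.uniform(1.0, 1e4)   # bending moment
        c = random.uniform(0.01, 1.0)  # distance from neutral axis
        I = random.uniform(1e-6, 1.0)  # second moment of area
        ref = f(M, c, I)
        if abs(ref - g(M, c, I)) > tol * max(1.0, abs(ref)):
            return False
    return True

print(formulas_agree(reference_stress, student_stress))  # True
```

The same point-sampling trick generalizes to any closed-form answer, which is why Ng sees civil engineering as auto-gradable.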
MOOCs’ early months have exposed some problems. For instance, Coursera, in an effort to curb reported instances of student plagiarism, has begun requiring students to renew their commitment to its academic honor code each time they turn in an essay.
A big hurdle for MOOCs in broadening engineering offerings is providing lab experiences comparable to what students would get on campus. “Certain labs can be done online, and certain cannot,” Thrun admits. “For computers, you can do it online because you’re just moving keystrokes up and down. Chemistry is much harder to do online. But we don’t have to do everything online,” he says. Anant Agarwal, president of EdX and director of MIT’s Computer Science and Artificial Intelligence Laboratory, agrees that in certain engineering and science fields, providing experimental opportunities online is “certainly a challenge.” Yet he sounds determined to meet it. “Eventually we want to cover all courses in all areas — humanities, medicine, engineering, STEM — the whole thing,” he says.
“We are very interested in creating virtual laboratories in all the engineering disciplines… It will definitely not be the same experience, but it will be a good experience,” Agarwal says. For a prototype MOOC on circuits that he gave, “our team created an online circuits and electronics laboratory that is extremely compelling,” he says. “In a real circuits laboratory, things keep breaking. They’re expensive. Students don’t get a whole instrument all to themselves. In a virtual laboratory, everybody gets their own laboratory; they get instruments that don’t break.” Or, he adds with a laugh, to “make our simulations more realistic, it is easy to imagine an environment where we have instruments that break… With simulation we can do anything… One can imagine students using simulation in the virtual world, and then there’s some part of the curriculum where they do some real lab work.”
In fact, Agarwal sees virtual labs as one part of a transformation he calls “the gamification” of learning. “All of the learning experience can be made much more fun, just like a video game.” Thrun agrees that “we want to go beyond replication of the classroom online. We want to make it a completely new experience. If you just replicated the classroom experience online, you’d always be worse than the classroom.”
The Saylor Foundation also offers lab supplements to some of its science courses. The measurement and experimentation lab in its mechanical engineering program, for example, includes both hands-on and virtual exercises, according to the foundation’s website. “Although we cannot virtually replicate the lab experience,” it notes in a biology lab course, “this ‘lab’ will familiarize you with scientific thinking and techniques and will enable you to explore some key principles.”
Tom Katsouleas, dean of engineering at Duke, says online courses could enable more students to gain the international experience increasingly required of the 21st-century workforce. Currently, engineering students overwhelmingly stay on their home campuses because their programs are “so structured that if you miss a prerequisite you can be thrown off,” he says. Online courses could allow them to “take their technical engineering requirements while they’re on their semester abroad and not lose time toward the degree.” Overseas programs could then reach outside the classroom, involving students “in an internship or setting up a clinic in Rwanda or building a bridge in Honduras.”
Beyond providing high-level instruction to students not attending college — Ng cites “the poor kid in India, the 40-year-old single mother who cannot take time off” — MOOCs can enrich the on-campus experience, say those involved. “We are not only offering online learning to people around the world — and we want to educate a billion people around the world — but we are also working very hard to reinvent campus education,” says Agarwal. “We believe that we can dramatically change the way we do things on campus.” Berkeley’s Sastry adds that online courses provide “an experimental platform to try to figure out how computer technologies can be used to enhance learning… The ethos here is to develop materials for the people who are here.”
MOOCs also allow teachers to implement what Ng calls the flipped classroom. “Year after year I have walked into the same room and said exactly the same thing,” he explains. “Using the flipped classroom, lectures are put online” for the students to watch outside of class. During class hours, “we instead do small-group problem solving or supplementary materials. I now feel for the first time in many years that I am actually interacting with students instead of talking at them.” The technology also permits just-in-time teaching. “Students do quizzes on the website, and the professor looks at the results and sees where problems are,” says Ng. “This way the professor focuses the time on what is confusing for the students,” rather than just giving a standard lecture.
It’s too early to predict whether MOOCs’ biggest impact will be in vastly expanding access to high-quality educational opportunities for people everywhere or in changing the economics of higher education. But it’s clear they are likely to have effects that no one can now even envision. “Technological revolution is always followed by social change,” Sastry says. “We live very differently now because of cellphones. We may have a different business model for our universities after another five or 10 years.”
Of course, technology has shaken up education before. History suggests resistance won’t work. Socrates discouraged students from taking notes, believing it would weaken the mind, Katsouleas reminds us. “Ironically, we know he said this, because his student, Plato, wrote it down.”
Beryl Benderly is a Washington-based freelance writer and a fellow of the American Association for the Advancement of Science.
The Central Intelligence Agency knew it faced a problem. The institution that had sent cameras into orbit in 1960 and in the 1970s developed an early insect-size drone was, by the mid-1990s, losing its grip on advancing technology. Overwhelmed with documents and data from an exploding Internet, the agency watched as adversaries eroded America’s edge in innovation and as venture capital lured bright graduates toward Silicon Valley start-ups and away from government research labs.
The CIA needed to tap emerging talent and spur development of useful technology, but it risked turning young entrepreneurs off with contracting rules and security clearance procedures. Its solution: Join the VC world. The agency would set up a private, nonprofit corporation — CIA-financed, but not officially part of the U.S. government — to serve as a middleman (and, in some cases, a matchmaker) between Langley’s spooks and Silicon Valley. Staffed with former high-level CIA technology officers who knew what the agency needed, it would provide capital needed to get promising startups off the ground. The name chosen had an Ian Fleming touch — a combination of Intelligence and Q, the fictional technology chief responsible for 007’s flashy gadgetry. In 1999, In-Q-Tel was born.
Over the past 13 years, IQT, as it styles itself informally, has spent tens of millions of dollars spotting and acquiring promising cutting-edge technologies. It seeks first to tap technology that already has proven commercial uses. “Why start from scratch if there’s already a commercial solution that gets you there?” an IQT official says. In those cases, IQT simply pays the firm a fee to license its software to the CIA or another intelligence agency. But often, it’s looking for fellow investors to join in technology development, and in guiding both existing firms and small start-up ventures toward potentially lucrative contracts with U.S. government intelligence agencies.
Seal of Approval
From its headquarters in Arlington, Va., and offices in the high-tech enclaves outside Boston and in Menlo Park, Calif., In-Q-Tel both invests in companies directly and tries to leverage its money by persuading private venture-capital funds and large corporations to become the principal backers. Its support for an innovation acts like a Good Housekeeping Seal — an inviting sign that an entrepreneur is likely eventually to win a CIA contract. Venture-capital firms consider In-Q-Tel a partner rather than a competitor, says Mark Heesen, president of the National Venture Capital Association, the industry’s Washington-based trade group. “Actually, they help cut through some red tape that otherwise might be a problem.”
In-Q-Tel prefers to keep quiet about specific technologies it snags, even though most early-stage development is unclassified. But it’s known to have invested early in Keyhole Inc., whose software for combining satellite images and maps eventually led to the development of Google Earth. It also funded Perceptive Pixel Inc., creator of touch-screen technology; ThingMagic, producer of radio tracking chips; Sonitus Medical Inc., which converted a hearing aid into a two-way radio that can send and receive voice traffic through a tooth; and Analytic Solutions, which designs software to help pinpoint potential criminals online. Other funded technology can enable people to use lasers to manipulate holograms on a computer monitor.
In-Q-Tel also won’t discuss its budget and finances, but the federal tax return that it must file each year as a nonprofit suggests that it receives more than $56 million a year in federal funds — almost all of it from the CIA. This money goes toward operating expenses and a $180 million fund from which In-Q-Tel draws about $35 million a year for investments and fees. In a typical year, the firm will work with 75 to 80 companies.
In-Q-Tel staffers scout promising technology by staying in constant touch with start-ups, universities, and others in the high-tech community. Over the years, they have reviewed some 10,000 business proposals, which can be sent to the firm’s website (www.iqt.org), and put together hundreds of investment plans involving private venture-capital funds, start-up companies, and, of course, intelligence agencies. In all, In-Q-Tel has brokered deals involving more than 180 start-up companies, usually capping its own stake in each at $1 million to $3 million.
The company typically doesn’t insist on blanket rights to the intellectual property that entrepreneurs create. More often, the investment involves dual-use technology — for commercial users and for the CIA — with separate licensing for each version. In exchange for guaranteeing a government market for new technology, In-Q-Tel demands a seat as an observer on the board of directors of the companies it supports, giving it some influence over products and how a firm conducts business. And it oversees any technological changes that the CIA wants.
IQT loses money on some investments, but reaps a tidy return when a startup succeeds commercially or attracts a deep-pocketed purchaser. Its investments include a credible list of financial successes. Perceptive Pixel has provided the lucrative basis for touch-screen technology for tablets and smartphones, while Keyhole Inc. was acquired by Google.
While enabling start-ups to avoid the usual federal contracting hurdles, In-Q-Tel itself is exempt from federal personnel and salary regulations, which proponents say is necessary to attract managers from the high-tech world. President and CEO Christopher Darby, who was an executive of Intel Corp. and several other high-tech firms before joining In-Q-Tel in 2006, gets about $1 million a year in salary and benefits. He leads a corporate management team, and a strategic investment team comprising investors, business executives, financial analysts, and business leaders with experience working for — and dealing with — high-tech companies.
As a nonprofit, In-Q-Tel has an independent board of trustees. They include high-tech executives and academics, among them Anita K. Jones, professor emeritus of computer science at the University of Virginia; Elisabeth Paté-Cornell, chair of the department of management science and engineering at Stanford University; and Charles M. Vest, president of the National Academy of Engineering and president emeritus of MIT.
By now, In-Q-Tel’s client base reaches well beyond the CIA to include the National Geospatial-Intelligence Agency, the National Security Agency, the Defense Intelligence Agency, and the Department of Homeland Security, which in turn help fund IQT. It has also provided a model for NASA and the Army, which have set up similar programs.
So far, In-Q-Tel has encountered relatively little criticism, partly because its secrecy shields it from public scrutiny and partly because it has avoided scandals. When a newspaper article lambasted its lucrative investment program for staff a few years ago, it scrapped the plan and substituted a less generous system.
And the organization has weathered an initial round of congressional skepticism. An assessment of its business model by Business Executives for National Security, an association of defense contractors, gave it high marks. So did a detailed 2005 case study by Josh Lerner, a professor at the Harvard Business School. The notion of creating a private corporation to bring cutting-edge ideas to the attention of the intelligence community was a “very powerful idea” that has worked well, Lerner says.
A continuing IQT quest is for new ways to manage the immense data stream from the Internet and wireless communications. Intelligence agencies not only collect information but also pick out what matters, analyze it, and share it with those having a need to know. The community’s failure to “connect the dots” is widely seen as a key reason why the United States was caught unprepared by the Sept. 11, 2001, terror attacks. “We sympathize with the working-level officers, drowning in information and trying to decide what is important or what needs to be done when no particular action has been requested of them,” wrote the bipartisan commission that probed the attacks. The explosion of social media adds to information overload. The Arab Spring, for example, generated some 190 million tweets each day.
Looking ahead at the exponential growth of communication, CIA Director David H. Petraeus points to the need for “transformational” devices and software that can be interconnected across the electromagnetic spectrum and sense and respond to what they collect. This means a focus on radio-frequency identification, sensor networks, tiny embedded servers, and energy harvesters — using next-generation cloud computing and supercomputers that can integrate data from closed societies and provide continuous, persistent monitoring of any place on the globe.
Intelligence agencies will also need more sophisticated cloud-computing technology to handle the fast-increasing volume of incoming open-source data — known as Big Data — and enable the intelligence community to store, process, and later access massive amounts of disparate data using parallel IT systems wherever they’re needed. Finally, all the data will have to be processed faster.
Together, these challenges mean In-Q-Tel’s role has probably never been more important. Addressing entrepreneurs, investors, and technologists brought together for an In-Q-Tel “summit” last spring, Petraeus noted, “Industry’s ability to rapidly prototype new products and get them to market…is a skill that government simply cannot match.”
Message to Q: Find yourself a start-up.
Art Pine, a Washington, D.C. writer, covered national security affairs for several major newspapers.
It has been hailed as a wonder material full of surprising, often counterintuitive properties, one of the most versatile materials ever to spring from the engineering lab. Tougher than steel, a whisper-thin coating of it can conduct electricity at room temperature, mend itself, and change properties according to whatever it’s wrapped around, including metal. Corrosion could disappear “almost like magic,” predicts MIT chemical engineering Prof. Michael Strano.
Isolated just eight years ago by researchers at Britain’s University of Manchester, this single-layer lattice of carbon atoms has fast become the stuff of legend. Its potential seems boundless, from sensors that detect mere molecules of poison gas to carbon fiber replacements and even the next Teflon. Thanks to its ability to conduct electricity some 100 times as fast as silicon, the foundation of integrated circuits, it could come to define 21st-century computing.
Are claims for graphene overblown? Mark Hersam, a professor of materials science and engineering at Northwestern University, is “worried about a graphene bubble.” Still, he notes, with so many wide-ranging potential applications, “if only one or two pan out, that would justify the excitement.” More bullish is Leonid Ponomarenko, a physicist at Manchester who works closely with the two physicists who won a Nobel Prize for isolating graphene, Andre Geim and Konstantin Novoselov. “No other material has such a diverse combination of properties,” says Ponomarenko. “And that’s what makes this material unique.”
Graphene actually encompasses a “huge variety of materials that have emerged and continue emerging,” as Geim noted in an email to Prism. These include graphene in many layers with varying properties, as well as graphene cousins, such as graphene oxide, graphane, and fluorographene. “If graphene does not work for a particular project, the counterparts come to the rescue,” explains Geim. “This assures not a single, not a hundred, but a cornucopia of graphene incarnations. Just as with plastics, one name covers many items.”
Even in pure form, graphene has properties as varied as they are contradictory. The lightest material yet discovered, it’s highly flexible and stretchable. But it’s also the strongest material on Earth — harder than a diamond and some 200 times as sturdy as steel. Graphene is the world’s most effective conductor of heat, more transparent than glass, and the thinnest material there is, though nothing is more impermeable. These diverse properties give engineers a lot to play with, which is why graphene is expected to revolutionize many industries, including electronics, energy, medicine, transportation, and aeronautics.
Ironically, this miracle material springs from mundane stock: graphite, or common pencil lead. Graphene’s structure was first theorized in 1947 but thought not to exist in nature. Then, in 2004, Geim and Novoselov took a small chunk of graphite and began peeling it apart, layer by layer, using cellophane tape. Eventually, they whittled down to a single, one-atom-thick layer that, when viewed under an electron microscope, looked like a sheet of carbon atoms arranged in a neat, repeating honeycomb pattern, like chicken wire. The discovery kick-started a science and engineering revolution. In 2005, just three published papers discussed graphene; by 2010, that number had soared to 3,000.
The first big commercial uses of graphene are most apt to be in high-value goods that can accommodate the cost of the material and don’t necessarily require pristine versions. Likely candidates include touch-screens, conductive ink for electronic printing, batteries and capacitors, flexible electronic devices, and solar cells. Yun Hang Hu, an associate professor of materials science and engineering at Michigan Tech, has mixed graphene with titanium dioxide to make dye-sensitized solar cells that are 52.4 percent better at converting light into electricity than current versions. “It is a very simple procedure, and commercially it would be easy to do,” Hu says. Graphene might also some day revolutionize fiber-optic telecommunications. Engineers at Columbia University, working with researchers at Singapore’s Institute of Microelectronics, recently created a graphene-silicon hybrid device that eventually could be used for ultralow-power, on-chip communications. Adding graphene crystals transforms crystals of silicon from a passive material into one that can generate microwave photonic signals and perform the conversion of photonic wavelengths at levels needed for faster, more efficient telecommunications. The material is also optically nonlinear, which means it can be switched on and off, a property required for digital telecommunications.
Graphene must overcome two major hurdles before claiming superstar status, however. First, there is still no way to mass-produce graphene economically. Second, graphene conducts electricity so well that there’s no way to turn off the current, which hampers its use in integrated circuits. Like most conductors, graphene lacks a band gap that separates the conduction and valence bands. Electrons move freely through the material but can’t be switched off, a function that is crucial to the binary operations of computers. Moreover, a chip laden with graphene transistors packed tightly together would leak so much current that it would melt almost instantaneously.
Dry Ice & Stainless Steel
The most pristine graphene still comes from micromechanical exfoliation — otherwise known as the Scotch tape method — which is hardly practical for industrial-scale production. Another method sprays a carbon-based chemical vapor on a substrate, usually copper, under high temperatures. While the quality is not bad, “it is expensive and difficult to grow in large areas,” says Junhong Chen, a mechanical engineering professor at the University of Wisconsin-Milwaukee.
Breakthroughs may loom, however. Researchers at Case Western Reserve University and South Korea’s Ulsan National Institute of Science and Technology claim their method can produce large quantities of quality graphene cheaply. They mix graphite and dry ice in a ball-mill canister containing stainless steel balls. The resulting flakes are dispersed in a solvent, where they form sheets of graphene. The flakes and a solvent can also be placed on a piece of silicon and then heated to produce a thin film of graphene. Liming Dai, a professor of macromolecular science and engineering at Case Western, says the resulting graphene “is almost as conductive” as pure graphene made with sticky tape. He’s convinced the method can yield large amounts of graphene in pieces of up to 10 to 14 inches, perfect for computer touchscreens. “Ball milling is an inexpensive process, and we don’t use expensive chemicals,” Dai adds.
There’s still the problem of introducing a band gap for use in computer chips. Possible solutions include a chemical treatment to create an insulating gap. The so-called Hummers’ method, first developed in 1957, uses harsh acids on graphite to produce the insulator graphene oxide, or GO. A second chemical bath changes the GO to graphene, though in a version that’s less conductive because the acids damage the delicate lattice of carbon atoms. More recently, Hersam’s group at Northwestern has oxidized graphene in an ultrahigh vacuum chamber containing a superhot tungsten filament. Researchers pump oxygen into the chamber, and the heat splits the molecules into oxygen atoms that insert themselves into the graphene lattice. The resulting material is highly homogeneous, and the oxidation process is reversible. Further tests will show whether it has opened up a sufficient energy gap and whether it preserves the flow of electrons. “It’s a significant advance chemically, but it remains to be seen if it’s an application advance,” Hersam says.
Meanwhile, Chen’s group at Wisconsin has invented something it calls GMO, or graphene monoxide. When the team members first heated GO in a vacuum, they anticipated the heat would drive off the oxygen. Instead, the carbon and oxygen atoms realigned themselves in an ordered pattern, creating a carbon oxide not found in nature that could function as a semiconductor. A computer model indicated it has a band gap of 0.9 electron volts. “That’s getting close” to silicon’s 1.1 eV, Chen says, adding that GMO “has a high sensitivity to strain, so we might be able to further engineer the gap” by applying force to induce a gap-altering deformation.
Back in Manchester, Ponomarenko, Geim, and Novoselov are trying a very different approach. They’ve created a vertically stacked graphene transistor that’s built like a layer cake, with a one-atom-thick insulating layer – either boron nitride or molybdenum disulfide – sandwiched between two layers of graphene, with the insulator serving as the band gap. The geometry works, Ponomarenko says, but researchers still must check if the sandwich-generated band gap remains intact when the transistor is reduced to the nanoscale level. “It should be much more efficient than silicon [transistors],” he says. The higher a transistor’s frequency, the faster it runs. Companies including IBM and Intel have built radio-frequency (RF) chips with graphene that have frequencies up to 300 GHz. But RF chips don’t need a proper “off” state, unlike those for digital applications. And commercially producing graphene in sandwich-like layers won’t prove easy.
Superthin graphene nanoribbons are another possible solution to the band-gap dilemma, since their extreme narrowness restricts the energy values an electron can acquire, thus introducing an energy gap. At the University of Bath, which runs Britain’s Centre for Graphene Science jointly with the University of Exeter, researchers are looking at nanoribbon self-assembly. For graphene nanoribbons to work in transistors, though, their edges have to be perfect, because any flaws cause the electrons to scatter. And since one chip can contain billions of transistors, there are doubts that countless, perfect nanoribbons can be produced. “That’s a tall order, in my opinion,” Hersam says. “Entropy wants defects.”
Adelina Ilie, a physics lecturer at Bath, notes that “graphene is important for applications and devices not yet invented, not just improving existing ones.” Her research focuses on inventing potential medical devices, including monitors that can stretch, flex, and be worn on the skin. To construct such devices, Bath’s lab has developed a specially adapted scanning probe microscope — dubbed a nanofactory — that has a tiny “stencil” attached so researchers can spray molecules onto graphene in various patterns. “Graphene is the quintessential material for doing nanoscience,” Ilie says, because molecules large and small readily attach to it.
If graphene were made to work in integrated circuits, it could usher in a new era of chips many times faster than today’s. “I hope that happens, but it will be hard,” Hersam acknowledges. “I’m not placing any bets on it.” But then again, he’s not betting against graphene, either.
Thomas K. Grose is Prism’s chief correspondent, based in London.
In the movie Independence Day, alien spaceships larger than Manhattan hover above the world’s major cities, emitting giant blue beams that spread death and destruction. In Albert Segall’s estimation, the weapons are superfluous. Given the downward force exerted by the stationary spaceship – a force equal to the total weight of the craft – the inhabitants would be crushed anyway. If that sounds morbid, it gives first-year engineering students an unforgettable lesson in static equilibrium and pressure. Another scene from the 1996 film depicts terrorized earthlings finding shelter from aliens’ plasma guns. Again, Segall delivers bad news, this time by way of conductive and radiative heat-transfer concepts. The victims’ supposed refuge has a metal door, which the beams will heat to 1,900 degrees Celsius in a few minutes. All inside will be “barbecued.”
Segall, an engineering science and mechanics professor at Penn State, has been using scenes from science fiction for more than a decade to teach basic engineering concepts. By pointing out—and then correcting—the scientific and engineering flaws in movies and TV episodes, he hopes to leave students with a lasting mental picture “of the way things function and the complexities of design.” Science fiction’s “potent combination of theory and imagery” not only serves the teaching of core topics but also helps illustrate engineering’s contributions to society and generates a positive image of the field, Segall argues.
In his first-year seminar, which combines ethics with hard engineering and scientific principles, Segall starts with the concepts he wants students to learn and then selects the right science fiction to get the job done. Many of the engineering topics discussed come from dissecting Independence Day, which Segall says “has so many great examples on so many different levels.” The original series of Star Trek, shown on television in the 1960s and still popular as a Web franchise, is another rich source. Principles of dynamics and mass acceleration, for instance, would prevent the spaceship Enterprise from flying smoothly through space. Instead, its lopsided design would cause it to do somersaults.
Segall’s seminar, one of a diverse assortment for freshmen, is popular. “From what I understand, the course is always filled,” says Renata Engel, Penn State’s associate dean of engineering for academic programs. “I think it’s a captivating topic.” She adds, “There’s no shortage of engineering and science content in these subjects.”
In 2002, when Segall described his technique in the Journal of Engineering Education, he lamented that science fiction wasn’t widely used in engineering. If that was the case then, it’s not anymore. Across the country, sci-fi and fantasy, from Star Trek and the original Outer Limits to Doctor Who and Harry Potter, are helping to draw students into engineering and science classes and make the lessons stick.
Trekkies & Tweets
At Syracuse University, many students sign up for TrekClass, based on Star Trek, out of sheer curiosity. Of the class’s 45 to 80 students per semester, only a handful start out as Trekkies—fans—but information studies Prof. Anthony Rotolo tells them, “I can’t promise you won’t be one when it’s over.” Rotolo uses the episodes broadly to teach social media, ethics, and technology, but says, “This class is really designed to try and spark an interest in the STEM disciplines.”
TrekClass uses full episodes of The Original Series (TOS), which first ran between 1966 and 1969, The Next Generation (TNG, 1987-94), and Voyager (1995-2001) to spur discussion, partly by having students tweet with one another as they watch. Teaching Assistant—“First Officer”—Meghan Dornbrock leads the conversation and keeps students on track.
In Star Trek, a race of cybernetic beings called the Borg (short for “cybernetic organism”) takes over the bodies of humans and aliens alike in a quest for galactic domination. Attempts to resist “assimilation” by the Borg Collective can be compared to what some see as a very real struggle against technology and social media invading every aspect of our lives. For the course’s main project, Rotolo challenges students to argue in favor of assimilation. This has inspired everything from in-house rap battles to video spoofs. Along the way, students begin to see how sharing thoughts and ideas in real time via Twitter or Facebook is not too different from being assimilated by the Borg.
Several students who took the course have gone on to major in information studies. One, Isaac Brennan Budmen, enjoyed it so much he entered graduate school in the field and now works with Rotolo as his graduate assistant. Computer engineering student Sergio Talavera, who is interested in robotics, says the class helped him better understand how society perceives the engineering profession.
Star Trek seems especially suited for aspiring engineers, providing grist for in-depth explanations of technological advances and the future of society. “Engineers and the people who make technology today were influenced by Star Trek,” says Dornbrock. It may help that there is at least one chief engineer character in each series, with Montgomery Scott from TOS being a favorite of Trekkies.
Technology & Its Implications
The series presents an easy segue to teaching engineering ethics. Depictions of future worlds show us the implications for our technology, says Segall. By learning through sci-fi, students witness what could happen when engineers don’t behave ethically and how work done with good intentions can still be misused.
But sci-fi offers many other vehicles. George Plitnik at Frostburg State University in Maryland teaches physics and engineering concepts via the Harry Potter books and movies. Strange-tasting jelly beans—which exist both in the books and as real, commercially sold products—become a lesson on engineering flavor and scent, with students asked to taste some and then guess what they are. Flying broomsticks provide a chance to discuss real levitation methods such as diamagnetism, the weak magnetic repulsion created by the moving electric charges present in all atoms. When an object is immersed in a strong enough magnetic field, that repulsion can overcome gravity, causing the object to hover. Broomsticks also lead to discussions about the flight of hot air balloons (Archimedes’ principle), airplanes (Bernoulli’s principle), and rockets (momentum conservation).
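The balloon aside rests on Archimedes’ principle: lift equals the weight of displaced ambient air minus the weight of the hot air inside the envelope. A minimal sketch, assuming illustrative air densities and a typical sport-balloon volume (these figures are our assumptions, not Plitnik’s):

```python
# Buoyant lift of a hot air balloon via Archimedes' principle.
# Densities and volume below are illustrative sea-level values.
g = 9.81                  # gravitational acceleration, m/s^2
rho_cold = 1.225          # kg/m^3, ambient air at roughly 15 C
rho_hot = 0.946           # kg/m^3, heated air at roughly 100 C
volume = 2800.0           # m^3, a typical sport-balloon envelope

# Net lift = weight of displaced cold air minus weight of the hot air inside.
lift_newtons = (rho_cold - rho_hot) * volume * g
payload_kg = lift_newtons / g     # mass the balloon can carry, in kg
print(round(payload_kg))          # -> 781
```

The same density-difference logic explains why a hotter burn, or a bigger envelope, buys more payload.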
Plitnik also frequently discusses genetic engineering. “Could you combine a human with an animal?” he asks, and, more important, “Would you want to do that?” Could a three-headed dog like Fluffy or a subservient house elf like Dobby be made in a lab? Plitnik says yes, in the near future if not now. But is it morally and ethically acceptable to try? Harry Potter acts as a “hook” to bring up these types of questions, engage students in conversation, and teach basic science concepts at the same time.
Genetic engineering and social implications of other technological advances are also major themes of an English class designed for technology students at DeVry-Pomona in California, in which literature Prof. David Layton is using the time-travelling cult classic Doctor Who.
The entertainment quotient in sci-fi courses deceives some students into thinking they’re easy, and some drop out when they discover otherwise. “I think it’s much more difficult than a number of students think it is when they sign up,” says Rotolo. Educators need to adjust as well. “It’s different than what they’re used to teaching,” Segall explains. Professors engaging students with sci-fi can’t just “shoot equations at them” and “chalk and talk,” conveying “no real understanding of the underlying concepts.” Complicated theories must be broken down, and applications shown clearly, so that students can get a distinct “visual image” of what’s going on. “You’re going to have to break from your traditional mode of engineering education” to teach with science fiction successfully, Segall says. “You have to be open to alternative points of view.”
But above all, “you have to make it fun.” He concludes: “Be yourself.”
Jaimie Schock is assistant editor of Prism.
When MIT announced in 2001 that it would put course materials online, who imagined that before the end of the decade, its OpenCourseWare website would draw 15 million visits a year? The concept “has surpassed our wildest dreams,” Catherine Casserly, director of the Open Educational Resources Initiative at the William and Flora Hewlett Foundation, said in 2009. So be prepared for more stunning surprises from the latest stage of the Internet revolution in education: the decision by America’s elite schools, including MIT, to issue certificates to students who complete their online courses. As Beryl Benderly reports in our cover story, this credential, available for a modest fee, suddenly turns MOOCs – massive open online courses – into bankable commodities. While the certificates won’t lead to degrees from the institutions that provide them, they could nevertheless lead to jobs. What all this means for most of America’s engineering instructors will take time to assess. But one thing is already clear to Autar Kaw of the University of South Florida, winner of ASEE’s 2011 National Outstanding Teaching Award and a leader in online education: They will have to “take their game up a notch.”
Equally far-reaching are the uses being explored for graphene, the wonder material derived, in nano form, from the lowly graphite in pencil lead. Able to conduct electricity some 100 times as fast as silicon, harder than diamond, many times tougher than steel – and yet flexible – graphene has kick-started a science and engineering revolution, Tom Grose writes in our second feature. Graphene has another quality that Prism’s talented designers have exploited: stunning beauty.
For sheer thrills, it’s hard to beat science fiction. An attention-grabbing tool for any instructor, sci-fi is particularly useful in teaching basic engineering concepts, as Jaimie Schock describes in “Fiction and Fact.” One favorite of Penn State Prof. Albert Segall is the movie Independence Day – not because it’s true to life, but because its dramatic episodes can readily be exposed as scientifically implausible.
We hope you’ll enjoy the October Prism, and we would be happy to hear or read any comments.
Two years ago, the Solar Impulse became the first solar-powered airplane to fly 26 hours nonstop. Now the Swiss-made craft has chalked up another record. This summer, it became the first sun-fueled aircraft to complete a trans-Mediterranean flight, covering around 3,100 miles from Payerne, Switzerland, to Rabat, Morocco. It was piloted by former balloonist Bertrand Piccard and André Borschberg, who together conceived the plane in 2003. Although the Solar Impulse has a wingspan equal to that of an Airbus A340, it weighs only about as much as a standard car. That’s because it’s made from lightweight carbon fiber materials. Some 12,000 solar cells in its wings feed energy to four 10-horsepower motors. Power stored in lithium batteries during the day permits night flying. En route back to Europe, the plane averaged a speed of around 37 mph during a 560-mile leg between Morocco and Spain while slicing through strong crosswinds. The team now plans to build an updated version of the Solar Impulse to tackle an around-the-world flight in 2014, powered only by the sun’s rays. – Mary Lord
Chocoholics rejoiced when a study earlier this year found that regular consumers of chocolate were, on average, slimmer than those who indulged only occasionally. Still, health experts worry about the amount of sugar and fat chocolate contains. Now chemists at Britain’s University of Warwick have developed a way to make tasty chocolate with only half the fat. Stefan Bon’s team infused dark, milk, and white chocolate mixtures with tiny drops of orange and cranberry juice while greatly reducing the amounts of cocoa butter and milk fats needed. The chemists used a technique called Pickering emulsion, which kept the droplets of juice from merging with one another. Apparently, the juice-enhanced chocolate tastes and crunches like the traditional stuff because the technique retains the crystal structure that keeps higher-fat chocolate firm and glossy, yet allows it to melt in the mouth. “Our study is just the starting point to healthier chocolate,” Bon says. – Thomas K. Grose
The Department of Defense has long been eager to develop renewable energy sources and pare its $4 billion-a-year utility bill. But the Pentagon also considers it risky to have military installations fully reliant on a commercial grid for power. Today, bases have backup generators, but they’re costly and not environmentally friendly, which is why DOD wants to set up renewable-based microgrids at its installations. Toward that goal, DOD and the Department of Interior have agreed to open up 16 million acres of military land — now managed by Interior’s Bureau of Land Management — for a variety of green energy projects, including solar, wind, geothermal, and biomass. For example, the Pentagon and the bureau want to authorize pilot solar projects at Arizona’s Yuma Proving Grounds and Fort Irwin in California. Around 13 million of those military acres are areas rich in renewable energy resources. – TG
Big Blue Eyes on Kenya
Home to 14 percent of the world’s population, Africa boasts one of the globe’s fastest-growing economies overall. That was one compelling reason why IBM recently opened its 12th global science and technology research lab in Nairobi, Kenya. Another big factor: Much of the African market is based on mobile computing, and Big Blue reckons it “has the potential to become a hotbed of [wireless] innovation for the rest of the world.” To help tackle the high-tech skills shortage in Africa, the lab will set up a resident scientist program for researchers with pre- and postdoctoral backgrounds from across the continent. Nairobi’s population is expected to jump from 3 million to 5 million by 2020, so the lab has also been charged with developing predictive analytics and models to help improve local water and transport systems. The latter could be a tough challenge. An IBM global survey of 15 cities ranked Nairobi as the fourth most congested. – TG
The Knowledge Gene
Venger Wind, a Nevada-based maker of small wind turbines, reached back to a 1922 design by Finnish engineer Sigurd Johannes Savonius as the basis for the 18 vertical axis turbines that together form the country’s largest building-integrated wind farm. Erected on the roof of the Oklahoma Medical Research Foundation (OMRF) in Oklahoma City, the 18.5-foot DNA-shaped turbines are positioned in three parallel rows and can catch both northerly and southerly winds. Each produces 4.5 kilowatts of electricity, and the OMRF expects the turbines to generate 85,500 kilowatt-hours of energy a year — enough to power seven average-size houses. The turbines also should cut carbon emissions by around 2 million pounds a year and save the equivalent of 44,000 gallons of gasoline. Venger’s V2 turbines start producing power at wind speeds of 8.9 mph — a breeze in windy Oklahoma City. – TG
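As a quick sanity check, the OMRF figures imply a capacity factor typical of urban rooftop wind. The calculation below uses only the numbers reported above, so any rounding is ours:

```python
# Back-of-the-envelope capacity factor for the OMRF rooftop wind farm,
# using only the figures reported in the article.
turbines = 18
rated_kw = 4.5                     # rated output per turbine, in kilowatts
expected_kwh_per_year = 85_500     # OMRF's projected annual generation

hours_per_year = 365 * 24          # 8,760 hours
# Energy produced if every turbine ran at full rating all year:
nameplate_kwh = turbines * rated_kw * hours_per_year

capacity_factor = expected_kwh_per_year / nameplate_kwh
print(f"{capacity_factor:.0%}")    # -> 12%
```

A figure of roughly 12 percent reflects the reality that building-mounted turbines see gusty, turbulent wind and rarely run at full output.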
See, Blind Mice!
In a healthy eye, an array of photoreceptors on the retina captures light and converts it into a neural code of impulses. Those impulses are then sent by ganglion cells to the brain, where the code is translated back into images. Diseases of the retina cause blindness in 25 million people worldwide, but most maladies leave these output cells unharmed. Current retinal devices use electrodes to stimulate those cells, but they allow the blind to see only spots and bits of light. However, in a potentially big breakthrough experiment, two researchers at Weill Cornell Medical College cracked the neural code for mouse retinas and developed a prosthesis that restores nearly normal vision to blind mice. They also recently broke the code for monkey retinas, which vary little from those of humans, and hope to test the technology on people soon. Sheila Nirenberg, a computational neuroscientist, and Stanford postdoc Chethan Pandarinath envision a visorlike device that uses a camera to take in light and a computer chip to translate the light into the neural code. Patients would also undergo gene therapy to introduce light-sensitive proteins to the ganglion cells. Says Nirenberg: “I can’t wait to get started on bringing this approach to patients.” – TG
Meet Your Makers
A group of engineering and design students, mostly from Stanford University, spent the summer driving an “educational build-mobile” across the country to spread the fun of hands-on learning and show kids how “to find their inner maker.” Dismayed to learn that budget cuts and standardized testing requirements meant that few schools today give kids the chance to build things, they raised a reported $300,000 on the crowd-funding site Kickstarter to outfit a panel truck — dubbed the SparkTruck — with rapid-prototyping tools, including two 3-D printers, a laser cutter, sewing machines, and a clay oven. Parking at schools, libraries, and children’s museums, the students put on workshops for 7- to 13-year-olds to demonstrate what a child’s natural creativity can produce with sophisticated equipment. Meanwhile, a trio of Stanford graduate students — two mechanical engineers and an M.B.A. candidate — have set up Maykah, a company that creates toys designed to inspire girls to become “artists, engineers, architects, and visionaries.” They hope it will help bring more women into the tech workforce, where women currently make up just 25 percent of workers. The students raised nearly $86,000 on Kickstarter to bring their first toy to market. Roominate, a miniature DIY house that is “stackable, attachable and customizable,” also includes working circuits. – TG
Oceans cover roughly 70 percent of the Earth. Estimates hold that converting a mere 2 percent of the energy contained in those waters would easily meet the entire world’s electricity needs. The United Kingdom, a leader in marine energy, has 46 tidal and wave energy projects underway, hoping to generate 200 megawatts of marine power by 2020, up from 7.7 MW today. In a U.S. pilot, Ocean Renewable Power is set to submerge five tide-powered electric turbines some 82 feet deep in Maine’s Cobscook Bay. They’ll be linked to shore by an underwater cable and are expected to generate 4 MW of electricity, enough to power 1,200 houses. This month, Ocean Power Technologies, a New Jersey company, plans to anchor America’s first commercially licensed, grid-connected wave-energy buoy some 2.5 miles off Oregon’s central coast. It has a federal permit for up to 10 generators, enough to power about 1,000 homes. Wave technology is so new, Oregon’s marine program coordinator Paul Klarin told the New York Times, that the designs are like a curiosity shop—all over the map in creative ways to connect waves to wires. Meanwhile, Australian researchers also are enthusing about marine energy’s potential. With adequate funding, a government-backed report says, Australia could use the seas to meet 10 percent of its power needs by 2050. Of course, like solar and wind energy, wave and tidal power is an intermittent source. But the ebbs are predictably regular, a plus for planners. – Chris Pritchard
It’s estimated that 3 percent of the electricity consumed in developed countries goes to treat wastewater, and much of that power is generated by harmful fossil fuels. But engineers at Oregon State University have developed a greener solution: a microbial fuel cell that produces electricity directly from wastewater while also cleansing it. They believe the process can be scaled up to commercial levels, producing enough electricity not only to power a plant but also to generate excess energy that could be sold to a grid. Researchers have long known that wastewater could provide huge amounts of clean energy, but figuring out how to best tap that potential power has proved difficult. The Oregon State engineers use bacteria to oxidize the biomatter, which produces electrons that jump from the anode to the cathode of a fuel cell to generate an electrical current. With refinements, the cost of the process eventually could be competitive with sludge treatments used today. – TG
The 1937 Hindenburg disaster ended zeppelin air travel. Now, some 75 years after the German craft exploded, the same New Jersey locale is ushering in a return of airships. The U.S. Army’s new hybrid Long Endurance Multi-Intelligence Vehicle (LEMV) completed a 90-minute maiden flight at Lakehurst Naval Air Station this summer. Designed by Britain’s Hybrid Air Vehicles (HAV) and built by Northrop Grumman, the 304-foot spy plane can gather and send images and signals — picked up by multiple arrays of sensors — to ground troops. The helium-filled blimp gets a bit of help from several tiny wings and four diesel engines. It’s also designed to be unmanned; Northrop Grumman says a fleet could be maintained by a handful of ground-support personnel. The LEMV is designed to stay aloft — at 22,000 feet — for three weeks at a cost of $20,000. A single fighter jet surveillance mission runs $10,000 per hour. Wouldn’t a large, stationary airship make a big target? Perhaps, but the helium-air mix isn’t flammable, so the ship wouldn’t explode if shot at. It’s also designed to leak very slowly if punctured, giving remote operators plenty of time to land it safely. – TG
Since its invention by 16th-century Englishman John Harington, the flush toilet has been a powerful public health weapon. But for much of the world, it remains an unimaginable luxury that consumes 10 times as much water as a person’s daily drinking requirements. High plumbing costs and water shortages perpetuate poor sanitation, spreading deadly diseases. Enter the Bill and Melinda Gates Foundation with a competition to reinvent the toilet [Prism, March 2012]. The new loo had to operate without running water, electricity, or a septic system, and cost just 5 cents per person a day. Some 28 designs were submitted. One, from Delft University of Technology in the Netherlands, turns poop into electricity using microwaves. The London School of Hygiene and Tropical Medicine’s prototype uses black soldier fly larvae to process waste into animal feed. The winning design came from Caltech. It features a solar-powered electrochemical reactor that turns water and excrement into hydrogen that can be used to generate electricity. Flushed with success, Caltech’s team went home with the $100,000 prize. – TG
In the 1970s, a pseudoscientific bestseller called The Secret Life of Plants argued that plants think and have feelings. It spawned a documentary with a Stevie Wonder score. Now comes Lorna Gibson, an MIT professor of materials science and engineering who makes a convincing case that plants do, indeed, have secrets to reveal — but not of the New Age variety. In a recent paper, Gibson shows that the mechanical properties of plants at the microscopic level are far-reaching and quite marvelous. Plant cells use only four main building blocks: cellulose, hemicellulose, lignin, and pectin. Stiffness or strength is determined by the composition and number of layers in a cell wall, how its cellulose fibers are arranged in those layers, and how much space the cell wall takes up. For instance, the diameter of a coconut tree changes little over its lifetime, so the thickness of its cellular walls depends on where they are located along the stem; those at the base are thicker to give it more support. While engineers have designed a wide variety of novel materials, from soft elastomers to sturdy alloys, Gibson says so far they have not been able to fabricate cellular composites with the controlled precision of plants. Plant cells not only serve mechanical functions but also must accommodate growth and provide surfaces that capture sunlight and transport fluids. Says Gibson: “With the development of nanotechnology, I think there is potential to develop multifunctional engineering materials inspired by plant microstructures.” – TG
Drivers and passengers are typically buffered from the routine noises a car makes as it drives along pavement. But when a tire hits a pothole, bump, or other unexpected obstacle, the resulting blast comes through loud and clear. Guohua Sun, a University of Cincinnati engineering Ph.D. student, has developed an algorithm that greatly deadens noise from unforeseen bumps in the road. The algorithm works fast to unleash an opposite-phased “mirror” wave of sound. As the sound wave from the road noise meets the mirror wave, the two cancel each other out. The road noise is reduced by 3 to 5 decibels, cutting its sound power by 50 percent or more. Of course, if roads had fewer potholes, drivers would experience far fewer jolts to the eardrums. Canada’s Python Manufacturing has a pothole-filling machine that can be operated by one person, from the safety of the driver’s cab, using a joystick to control the truck’s tool arm. Within two minutes, the Python 5000 can clean, treat, and fill a hole with either hot or cold asphalt, then tamp it down with the force of a paving machine. The 5000 sells for $290,000, but Python says that in five years it could save a highway department 40 percent over standard methods. – TG
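The mirror-wave trick and the decibel arithmetic can be sketched in a few lines of Python. The tone frequency and amplitude below are illustrative assumptions, and real cancellation is only partial, not the perfect case modeled here:

```python
import math

# Idealized phase-inversion ("mirror wave") noise cancellation.
# The 120 Hz tone and 0.8 amplitude are illustrative, not figures from Sun's work.
samples = 1000
road_noise = [0.8 * math.sin(2 * math.pi * 120 * n / samples) for n in range(samples)]
anti_noise = [-s for s in road_noise]          # the opposite-phased mirror wave
residual = [a + b for a, b in zip(road_noise, anti_noise)]
assert all(abs(r) < 1e-12 for r in residual)   # perfect cancellation in the ideal case

# A 3 dB cut halves the sound *power* (10 ** (-dB / 10)):
print(round(10 ** (-3 / 10), 2))   # -> 0.5, i.e., about half the power remains
```

Note that a 3 dB drop halves acoustic power, but listeners typically judge a sound “half as loud” only after a drop of roughly 10 dB.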
Factoid – 29.8% – The share of U.S. engineers engaged in international collaborations, based on a new analysis of 2006 data. The work ranged from research and development to supervision of people and projects, and teaching. Chemical engineers were the most collaborative at 43 percent. Source: National Center for Science and Engineering Statistics InfoBrief, August 2012
This month’s Databytes draws inspiration from a recent debate on whether engineering schools are producing enough graduates for current job openings. We asked the question: Will engineering disciplines with higher median pay and greater predicted job growth show greater undergraduate enrollment growth than disciplines with lower median pay and lower predicted job growth? To answer this question, we looked at part-time and full-time engineering undergraduate students enrolled in 2006 and 2011 for selected engineering disciplines, and matched them with Department of Labor statistics on predicted job growth and median pay for engineering disciplines from 2010 to 2020.
An environmental researcher pokes holes in favored alternative-energy strategies.
There are skeptics, and then there are environmental contrarians like Ozzie Zehner, author of Green Illusions: The Dirty Secrets of Clean Energy and the Future of Environmentalism. Almost no popular alternative energy solution escapes his skewering. Electric vehicles (EVs) and hybrids? No better than the gasoline-powered variety. Solar and wind power? Overhyped “greenwash” and not all that clean, either. Hydrogen? A dead-end technology that keeps returning like a “zombie,” resurrected by special interests.
Such critiques might not be so unusual were Zehner a climate-change denier or conservative activist. But he’s an environmental researcher and visiting scholar at the University of California, Berkeley, who plays for the green team, too. He serves on the editorial board of the online journal Critical Environmentalism, for instance. The difference? Zehner maintains that in their rush to tear apart their opponents’ half-truths, cherry-picked facts, pseudoscience, and outright lies, eco-champions often turn a blind eye to the equally bogus claims of alternative energy proponents. His book “looks at the unintended consequences of alternative energy technologies and how they have seduced us into overlooking durable, inexpensive, and ultimately more enjoyable solutions.” It also is an antidote, he says, to the way the media typically frame the problem: that the world faces an energy shortage and needs new ways to create even more energy.
Although Green Illusions takes aim at a host of technologies, reviewers have mainly focused on Zehner's dismissal of EVs and hybrids. He maintains that battery and electric-motor manufacturing require so much energy, as well as toxic materials and minerals, that the end result is in no way a green solution. The EV lobby wasn't amused. In a lengthy Wired article, EV proponents slammed Zehner's research as "ridiculous" and "dubious." Zehner remains unruffled, saying the industry is "clearly interested in protecting [its] turf." But he does admit to being initially surprised that his whack at EVs — just a small part of his book — generated so much coverage. He has come to understand it as natural, given the emotional connection Americans have to cars. Pointing out eco-mobile limitations, Zehner says, is "almost as bad as calling somebody's baby ugly."
Green Illusions isn’t just a diatribe against environmental sacred cows, however. Zehner, who has a bachelor’s in engineering from Michigan’s Kettering University and graduate degrees in science and technology studies from the University of Amsterdam, also calls his book a “constructive” critique. He wrote it, he says, “to spur discussions about how alternative energy technologies can be more relevant in the future.” So Zehner also devotes many pages to what he contends are more sustainable solutions to pollution, greenhouse gas emissions, and climate change. Mainly, he argues, society would do better to greatly reduce global energy consumption, which would then make some alternative forms of energy more workable. Spend more on mass transit, Zehner suggests. Fix urban traffic flows to encourage more walking and bicycling. Though a fan of energy-saving architecture, he says today’s green-building designs often rely too heavily on just two technologies: solar panels and urban wind turbines.
Isn’t trying to put the planet on a stringent energy diet as unworkable and politically naive as is trying to curb the global love affair with cars? Perhaps, Zehner concedes. But at least he’s trying to start the discussion. “I certainly don’t have all of the answers, so I see that dialogue as necessary,” he says. Even if it means irritating a few environmentalists along the way.
Thomas K. Grose is Prism’s chief correspondent, based in London.
A surgeon finds a key to medical safety — and newfound respect for engineers.
On the strong recommendation of a trusted colleague, I recently read a book that proved to be fascinating on several levels. The book is The Checklist Manifesto: How to Get Things Right by the surgeon Atul Gawande, and it presents a convincing argument for the efficacy of operating-room teams going down a checklist before beginning surgery.
After noting that there are three times as many deaths attributed to complications from surgery as there are from highway accidents each year, Gawande chronicles his quest for ways to make the operating room safer. Among the places he looks to for ideas are the construction site and the airplane cockpit, both considered domains in which engineering plays a central role.
Building failures in the United States are extremely rare, Gawande notes, and he cites a 2003 Ohio State University study that found there have been only about 20 partial or full collapses per year in a population of over 100 million existing buildings. To find out how the industry achieves such a high success rate, he interviewed the structural engineer responsible for the design of a new wing being built for a Boston hospital.
Upon first meeting, the surgeon found the engineer to be other than what he had expected. Indeed, Gawande found he had “a cheery, take-your-time, how-about-some-coffee manner” rather than, we must assume, the gloomy, rushed, let’s-get-right-down-to-work style for which engineers might be known, at least among surgeons. The doctor liked the engineer’s desk-side manner. But this was really a business call, for the doctor wanted to know how structural engineers achieved such high rates of success in erecting skyscrapers and other buildings.
Talking later with the “project executive” for the new hospital wing, Gawande learned that the secret to construction success lay in the critical-path method of scheduling, which the doctor saw as a checklist. He saw that the amount of knowledge and degree of complexity the project executive had to manage “were as monstrous as anything [he] had encountered in medicine.” This gave him a newfound respect for engineers, from whose practices he thought surgeons could learn.
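The critical-path method that impressed Gawande is itself a compact algorithm: each task can finish no earlier than its own duration plus the latest finish of its prerequisites, and the project length is the longest such chain. A minimal sketch, with task names and durations invented for illustration (and assuming the dependency graph is acyclic):

```python
# Minimal critical-path sketch. Tasks, durations (in days), and
# dependencies are invented for illustration.

tasks = {  # task: (duration, prerequisites)
    "foundation": (10, []),
    "framing": (20, ["foundation"]),
    "plumbing": (7, ["framing"]),
    "electrical": (9, ["framing"]),
    "inspection": (1, ["plumbing", "electrical"]),
}

def critical_path(tasks):
    """Earliest finish time of each task, assuming an acyclic dependency graph.

    The project length is the maximum earliest-finish time.
    """
    finish = {}

    def earliest(t):
        if t not in finish:
            duration, prereqs = tasks[t]
            finish[t] = duration + max((earliest(p) for p in prereqs), default=0)
        return finish[t]

    for t in tasks:
        earliest(t)
    return finish

finish = critical_path(tasks)
print(max(finish.values()))  # total duration along the critical path
```

The "monstrous" complexity Gawande describes comes not from this calculation but from keeping thousands of such tasks, and the people responsible for them, honestly updated.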
Gawande was also impressed by the system by which problems were resolved on the building site. In addition to the construction schedule, there was a “submittal schedule,” which effectively listed communication tasks to be completed. These tasks were necessary so that the many different groups involved in the project would be on the same page regarding when the project was ready to move to a new phase. The surgeon saw this as another form of checklist.
He also looked to the commercial airline industry, whose safety record is well known. He visited a person at Boeing responsible for developing checklists to cover everything from normal engine startup to such in-flight emergencies as a cargo door latch failure or sudden engine shutdown. The sequence of tasks to be performed under such circumstances also was deemed a checklist.
Convinced that checklists would bring improved safety to the operating room, Gawande and his team developed and tested them. After a six-month trial period, they found that the rate of major complications for surgery patients fell by 36 percent, and deaths declined by 47 percent. The lessons learned from engineering good practice had produced medical results greater than expected.
Gawande has become an ardent proponent of checklists, and his Checklist Manifesto is an eloquent and persuasive call for greater use of them in all areas of hospital treatment and care. His book is equally valuable for its recognition that engineering has a lot to offer other professions.
Henry Petroski is the Aleksandar S. Vesic Professor of Civil Engineering and a professor of history at Duke University. His latest books are An Engineer’s Alphabet: Gleanings from the Softer Side of a Profession (2011) and To Forgive Design: Understanding Failure (2012).
More no-nonsense engineers in Congress would make Washington work better.
Election season is here, the time when we decide who will occupy our political offices. Our nation has seen ongoing economic woes and tensions between socioeconomic classes, and these weigh heavily on voters’ minds. As a middle-class citizen and an engineering student, I have made my decision: I want to vote for the 1 percent.
The 1 percent can be effective public servants. They possess the ability to think critically, allowing them to solve problems for the 99 percent. They can analyze complex systems. Moreover, they are known for objectivity and systematic assessment of evidence before reaching conclusions. We need more leaders with the no-nonsense qualities of the 1 percent.
To be clear, I am not talking about the wealthiest 1 percent of Americans. Instead, I am referring to engineers, who currently represent about 1 percent of elected officials in the U.S. Congress but 10 percent of the American workforce.
Inviting engineers into politics may seem as bizarre an idea as letting bicycle mechanics manage your household affairs. Yet engineers have skills that are fundamentally relevant to politics. Like politicians, engineers operate in a realm where budgets must be planned and followed, and where reaching consensus means making trade-offs between multiple objectives. While engineers lack the law degrees and business expertise that many politicians possess, they are wired to solve problems and improve efficiency.
If more engineers held office, perhaps they could unfreeze our hopelessly stalemated political machinery. A recent Pew Research Center poll shows that Democrats and Republicans have become increasingly polarized over recent years. If engineers can identify opportunities for political compromise in the same way that they negotiate multiple constraints in technical design, then they might broker bipartisan agreements. Of course, this presumes engineers do not sheepishly follow a party line or engage in political shenanigans; they are human, after all.
We also need more engineers in office because several modern challenges are rooted in technical realities. These include the looming impacts of climate change, the drive for energy independence, nuclear security, and the infrastructure and resource issues related to global population growth. Politicians rely on committee specialists to analyze and distill technical information, but important considerations may be lost in translation or fall on deaf ears because of political bias. The intersection between technical knowledge and policy is vital for cohesive decision making. More elected engineers could help ensure a robust interface between technical expertise and policy.
Despite their potential benefits as policymakers, I don’t expect to find many engineers on the November ballot. Yet many engineers are passionate about politics. Why do just 1 percent serve in Congress?
One reason is the culture of our profession. A political hiatus might as well be professional exile, since the engineer inevitably will “lose practice” designing buildings during his or her term, while the researcher will exit the paper-publishing race. This prospect scares engineers. If more engineers are to participate in politics, a culture shift is needed in industry, universities, and research agencies to accommodate political endeavors. Strangely, many institutions are unforgiving to employees with political aspirations, even though as officeholders those individuals might champion federal investments in infrastructure and research programs.
Public perception about the electability of engineers is another reason. With such a small sample size, there are no data to suggest the effectiveness of a Congress with more engineers. In countries like China, by contrast, it is not uncommon to have engineers and technocrats occupy the majority of top positions. A wave of courageous and charismatic engineers in office might shift perceptions and inspire other engineers to become active participants in the political process at all levels.
This grazes the surface of an admittedly complex proposition. While engineers are no silver bullet, having enough of the right engineers could help alleviate some political inefficiencies. My father often says that the world needs more engineers and fewer lawyers. Perhaps we should consider whether our political leadership needs more of the 1 percent.
Mark Raleigh is a doctoral candidate in civil and environmental engineering at the University of Washington.
Like other groups, Asian-Americans endure stressful stereotypes.
Asians and Asian-Americans occupy a unique position in engineering education. Though overrepresented relative to the U.S. population, they remain a minority among engineering students at most institutions. Do Asians and Asian-Americans face similar stereotyping and discrimination issues as other minority groups? Our long-term qualitative and quantitative study of engineering students from a variety of racial and ethnic backgrounds suggests they do. The results should serve notice that equity in STEM education is not ensured simply by proportional representation.
Launched in 2006, the study used critical race theory as a framework. The theory holds that race is used by society to benefit the powerful. We discovered that many of our Asian-American participants talked indirectly about the “model minority” racial stereotype and how it affected their daily life in engineering. This stereotype holds that students of Asian heritage are hardworking, intelligent (particularly in math and science), interested in economic prestige and educational attainment, and uninterested in racial-identity politics.
While some aspects of the model minority stereotype could be considered desirable qualities in engineering students, such as the presumption that someone is intelligent or hardworking, that thinking is problematic. Many participants saw the United States as a colorblind meritocracy, despite describing everyday experiences of being racially stereotyped and telling stories of blatant discrimination. For example, one participant perceived that Asian-American students were trying to get As while white students were just trying to pass. Another participant described being asked, “Do you eat dogs?” by a project group member during introductions. A third participant told of an Asian instructor who stated that he would be stricter in grading Asian students because Asian students study more.
The model minority stereotype is generally invisible to or ignored by both engineering students and faculty, including Asian-Americans. Moreover, engineering students have few venues to learn about the impact of race or how to articulate or respond to any negative consequences they might experience. Engineering courses generally do not address racial issues, and the curricula are so full that students interested in racial issues have little opportunity to take relevant courses without delaying graduation. And, unlike members of other minority groups, Asian-American engineering students have no widely recognized national technical organization, although an educational and professional organization has recently been established.
The model minority stereotype is dangerous for Asian-American engineering students because it implies that they do not need the support offered to other minority populations. Affirmative action policies, for example, typically exclude Asian-Americans. Despite facing many of the same hurdles as other underrepresented minorities, such as attending poor urban schools, having families with no or little experience with the U.S. higher education system, or being of limited financial means, Asian-American students are assumed to have all the prerequisites for success in engineering. Like all stereotypes, the model minority label may result in excessive stress and diminished accomplishments.
To encourage greater equity, institutions should consider training faculty and staff to recognize stereotypes, including those applied to Asian-Americans, and to respond to all incidents of everyday racist behavior. That includes countering assumptions that Asians are not suitable for leadership positions or lack American citizenship. In addition, institutions should realize that the absence of discrimination complaints by minority students does not mean the institution is equitable. Finally, institutions should extend to Asian-Americans the same benefits provided to members of underrepresented groups.
Deborah A. Trytten and Susan E. Walden are researchers with the Research Institute for STEM Education (RISE) in the College of Engineering at the University of Oklahoma. Anna Wong Lowe is an instructor of communication at Oklahoma Baptist University. This article is excerpted from “‘Asians are Good at Math. What an Awful Stereotype’: The Model Minority Stereotype’s Impact on Asian American Engineering Students” in the July 2012 Journal of Engineering Education. The work was funded by National Science Foundation Grant DUE-0431642.
AT&T’s former unique status let genius flourish.
The Idea Factory: Bell Labs and the Great Age of American Innovation.
By Jon Gertner, Penguin Press 2012, 422 pages.
It is easy to take for granted our 24/7 global connectivity and its progress from telegraph and telephone to fax, cellphones and email and, finally, Internet-based social media and video. But it was on a grassy campus in Murray Hill, N.J., that many of these inventions were first conceived and developed; and this is the story Jon Gertner relates in The Idea Factory: Bell Labs and the Great Age of American Innovation.
At the height of its productivity, the Bell Telephone Laboratories employed some 1,200 science and engineering Ph.D.’s. The research unit formed one part of the massive three-pronged organization of the American Telephone and Telegraph Co., with the business side focused on telephone customers throughout the United States and the manufacturing subsidiary, Western Electric, producing everything from telephone cables to poles and switchboards. This trifurcated organization secured extensive funding for the labs, while AT&T’s monopoly status ensured its research clout. Indeed, from the 1930s through the 1970s, Bell Labs became “the country’s intellectual utopia” and in the process laid the foundations for modern telecommunications.
When Gertner, an editor at Fast Company magazine, declares that Bell Labs was “for a long stretch of the 20th century…the most innovative scientific organization in the world,” the panegyric tone of the book becomes clear. Given the labs’ astounding record of achievement, however – including the creation of the first transistor and cell-phone, silicon solar cells, laser technology, radio astronomy, and the UNIX operating system, not to mention several Nobel Prize awards – most engineers may agree and thoroughly enjoy this heady narrative of scientific experimentation.
Focusing on six men strongly involved in shaping the labs, The Idea Factory provides insight into their lives, work, and eccentricities. Mervin Kelly, president from 1951 to 1959, played a pivotal role in establishing a tradition of wide-ranging research by recruiting top scientists and engineers, creating interdisciplinary working groups, and designing sleek, efficient buildings for Murray Hill. Equally significant, but far more difficult a personality, was physicist William Shockley. Fearing that his contributions to the transistor would be overshadowed by the team working under him, Shockley secretly raced to make his own improvements that, once unveiled, secured his claim as coinventor. His subsequent efforts to operate a semiconductor lab in Mountain View, Calif., were marred by further disputes with coworkers, yet his company and the several spinoffs it inspired formed the beginnings of Silicon Valley.
Perhaps the most unusual researcher was mathematician and electrical engineer Claude Shannon, credited with founding information theory. As someone who enjoyed building complex mechanical toys and pedaling through the office halls at night on a unicycle – sometimes while juggling – Shannon declared himself to be more interested in the elegance of a problem than its practical application. In 1948, his treatise “A Mathematical Theory of Communication” would prove both elegant and revolutionary in its proposed use of binary digits – strings of ones and zeros – for electronic transmissions.
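Shannon's central quantity is easy to state in code: the entropy of a source, measured in bits per symbol, bounds how compactly its messages can be encoded as strings of ones and zeros. A brief sketch (the probability distributions are arbitrary examples, not drawn from the book):

```python
# Shannon entropy of a discrete source, in bits per symbol.
# The probability distributions below are arbitrary, chosen for illustration.

from math import log2

def entropy(probabilities):
    """H = -sum(p * log2(p)), taken over the nonzero probabilities."""
    return -sum(p * log2(p) for p in probabilities if p > 0)

# A fair coin carries exactly 1 bit per toss ...
print(entropy([0.5, 0.5]))   # 1.0
# ... while a heavily biased source can be coded in fewer bits on average.
print(entropy([0.9, 0.1]))   # about 0.469
```

That one formula is why a predictable signal compresses well and a random one does not, a result every modem and media codec quietly depends on.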
These vivid profiles bring the workings of Bell Labs to life, as do discussions of several decades of collaboration with the U.S. government on cryptanalysis, spy satellites, and other covert projects. The association helped maintain AT&T’s monopoly, but pressure from the Justice Department eventually prevailed. In 1982 the company divested its local telephone branches, and inevitable downsizing followed. Today, the labs operate at one-third their former size, jointly owned by French telecommunications company Alcatel-Lucent.
Casting his eye upon the myriad companies that have taken pages from the Bell Labs playbook, Gertner asks if it is possible to revive a truly visionary approach to innovation. Though a few promising models are identified, including the energy innovation hubs established by U.S. Secretary of Energy Steven Chu, the author and others who weigh in generally agree that the circumstances of the period, the country, and company will not be replicated. Ending as a paean to the historic achievement of Bell Labs, the book nonetheless dangles the question of how to tackle our present and future wicked problems.
Robin Tatu is Prism’s senior editorial consultant.
When we urge students to become P.E.’s, we should lead by example.
Engineering faculty members often stress to students the importance of a professional engineer’s license. As North Carolina State’s engineering college states on its website, “The P.E. license is the engineering profession’s highest standard of competence.” We also work hard to ensure that our programs receive continuing ABET accreditation, and certify to state licensing boards that our students have fulfilled the requirements for the Fundamentals of Engineering exam, one of the prerequisites for a license in most jurisdictions.
Yet many faculty members themselves are not licensed. We believe that needs to change and that faculty licensure is an important step in raising the stature of the engineering profession. We therefore urge universities to set a goal of requiring a P.E. for all tenured faculty.
Some will argue that tenured and tenure-track faculty with doctorates already have very high credentials. This is undoubtedly true, but a license means something more. It means a practitioner can be trusted, as the NCEES Model Law puts it, to “safeguard life, health, and property and to promote the public welfare.” The need for this protection is made all too clear in the history of such spectacular and tragic failures as the 1919 Boston Molasses Flood, the 1940 Tacoma Narrows bridge collapse, the 1986 space shuttle Challenger disaster, the 2007 I-35W bridge collapse, and the 2010 Deepwater Horizon oil spill.
If we want our students to acquire both the theoretical and the practical knowledge to prevent such disasters in the future, don’t we owe it to them to set the same standard for ourselves? The National Society of Professional Engineers thinks we do: “NSPE recognizes the responsibility of engineering faculty to formulate curricula and to teach students to prepare them for the professional practice of engineering. To fulfill this responsibility as it relates to the public health, safety, and welfare, engineering faculty teaching advanced engineering subjects should be licensed professional engineers.”
Those of us who are licensed faculty tend to agree. As David Rockstraw, professor of chemical engineering at New Mexico State University, says: “Teaching the theory of ice skating and actually getting out on the rink are two different things. I earned my P.E. the old-fashioned way, by getting practical experience even after I became a faculty member.”
In the NCEES Model Law, the practice of engineering includes not only those aspects we readily associate with practice, such as planning, design, operation, investigation, and expert technical testimony, but also “teaching of advanced engineering subjects.” With the inclusion of teaching as practice, NCEES is in effect suggesting that jurisdictions should require licensure for engineering faculty. State statutes in Missouri, Alaska, Texas, and Wyoming contain similar language, indicating that faculty licensure is required if the laws are strictly enforced. The Wyoming Board of Registration apparently goes further to “require that the dean of the College of Engineering and Applied Science at the University of Wyoming be a licensed P.E.”
Certain disciplines, such as software engineering, have not historically emphasized licensure, but that is changing. As Today’s Engineer reported early last year, “Nine states are moving legislation that will require licensure of software engineers, and it is expected that, eventually, every other U.S. state and territory will follow suit.”
What about those faculty who are mainly focused on research and development? To the extent that the R&D involves the practice of engineering (applied research and development, not the basic or fundamental research that is focused on advancing the science) and thereby can affect public health, safety, and welfare, the answer seems clear: they should be licensed or on the path to licensure. Ideally, academia should take the lead, regulate itself, and encourage all current and new faculty members to obtain licenses. But if schools fail to act, it would fall to professional societies and NCEES to encourage states not only to require faculty licensure but, once such laws are enacted, to enforce them.
Raising the stature of engineering begins with winning the respect of the public. Americans should be assured that those preparing future engineers to be responsible for public safety are equipped to set an example.
Rob Lang, P.E., is a former dean and professor of civil engineering at the University of Alaska Anchorage. Kirankumar Topudurti, P.E., is deputy director of the Engineer Research and Development Center-Construction Engineering Research Laboratory, U.S. Army Corps of Engineers and an adjunct faculty member at the Illinois Institute of Technology.