The Defense Advanced Research Projects Agency, or DARPA (or, as we like to refer to it here, following a suggestion of Mr. J.B., the Diabolically Apocalyptic Research Projects Agency), has a plan to "reinvent" electronics, and specifically the computer chip, and even to reinvent the way chip innovation is done. Mr. R.M. shared this story by Martin Giles from the MIT Technology Review, and something in it caught my eye, particularly in view of yesterday's end-of-the-twig high octane speculation about steam locomotives and Union Pacific's "heritage fleet" of steam engines.

Here's the story:

DARPA has an ambitious $1.5 billion plan to reinvent electronics

There's the usual DARPA boilerplate about not only engineering, but engineering how engineering is done:

One project aims to radically reduce the time it takes to create a new chip design, from years or months to just a day, by automating the process with machine learning and other tools so that even relatively inexperienced users can create high-quality designs.

“No one yet knows how to get a new chip design completed in 24 hours safely without human intervention,” says Andrew Kahng of the University of California, San Diego, who’s leading one of the teams involved. “This is a fundamentally new approach we’re developing.”

“We’re trying to engineer the craft brewing revolution in electronics,” says William Chappell, the head of the DARPA office that manages the ERI program. The agency hopes that the automated design tools will inspire smaller companies without the resources of giant chip makers, just as specialized brewers in the US have innovated alongside the beer industry’s giants.

But then there was something that caught my eye, implying that DARPA wants an entirely new type of chip architecture:

Another ERI project will explore ways in which novel circuit integration schemes can eliminate, or at least greatly reduce, the need to shift data around. The ultimate goal is to effectively embed computing power in memory, which could lead to dramatic increases in performance.

On the chip architecture front, DARPA wants to create hardware and software that can be reconfigured in real time to handle more general tasks or specialized ones such as specific artificial-intelligence applications. Today, multiple chips are needed, driving up complexity and cost.

Some of DARPA’s efforts overlap with areas already being worked on extensively in industry. An example is a project to develop 3-D system-on-chip technology, which aims to extend Moore’s Law by using new materials such as carbon nanotubes, and smarter ways of stacking and partitioning electronic circuits. Chappell acknowledges the overlap, but he says the agency’s own work is “probably the biggest effort to make [the approach] real.” (Emphasis added)

Yesterday, you'll recall, I indulged in some really high octane speculations about typewriters and the ultimate in analog locomotive technology, the steam engine. My concern was couched in terms of "how would one haul long distance heavy freight in the event of an EMP attack on North America?" The answer, obviously, wasn't diesel-electric locomotives, but steam locomotives. The fly in that speculative ointment was not in the locomotive end of the spectrum, but in the railroad signalling and switching end, since most of this is now (and has long been) automated and therefore electrical.

So what's the relevance here? Methods of "hardening" electrical and digital systems have long been of concern to the military and therefore to DARPA. The internet, for example, in the form of the old ARPANET, emerged out of concerns about maintaining command and control of the country's military forces, and particularly its nuclear and thermonuclear forces, in the event of just such a "take-down." Decentralization of computer networks, burying cables, and hardening systems against the effects of EMP were the result, and many of these measures were and still are classified. But the point is that most of those systems, as far as I am aware, utilize methods that are not "built in" to the circuits they are trying to protect. To draw an analogy which might make this point clearer, cables are still buried in an effort to protect them, but the cables' circuitry itself is not altered as a means of such hardening, except in a case mentioned below.

But what if one could "harden" digital systems against such EMP attack, or other systemic breakdown, not by materials science alone, but by circuit architecture in combination with materials science? What if a way could be found to distinguish between the "noise" of an EMP attack or other electromagnetic transient and the actual "work" being performed by a chip or chips, and, having discerned the two, redirect the energy of the transient into the ground, or, in the ultimate electronic jiu jitsu, simply capture and store that energy in capacitors? In other words, what if one discovered a way to "partition electrical circuits" to do just what I've suggested? This would, in effect, make current "hardening methods" all but obsolete, provide a measure of security to communications and industry that they currently do not have, and, not incidentally, create a huge new industry. What if computer architectures and digital systems in general could be designed to respond to such attacks by more or less instantaneous modifications to their own architecture, avoiding the nasty results of such attacks? Nor need one think about the usefulness of such a capability solely in terms of deliberate EMP attacks. Other transients might be avoided as well; consider, for example, the phenomenon of exploding transformers on power grids during electrical storms and lightning strikes. To a certain extent the power grid already has safeguards against such things built into its architecture, which, as we know, occasionally don't work.
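To make the "discernment" step concrete, here is a toy sketch (emphatically not a real hardening method) in which voltage samples on a line are classified as normal "work" signals or EMP-like transients by amplitude and slew rate; every threshold, name, and number below is a hypothetical placeholder:

```python
# Toy illustration only: classifying line samples as normal "work" signals
# versus EMP-like transients, using amplitude and slew-rate thresholds.
# Every number and name here is a hypothetical placeholder, not a real
# hardening technique.

LOGIC_HIGH_V = 1.5            # nominal logic level of the protected circuit
TRANSIENT_V = 50.0            # amplitudes far above logic levels are suspect
MAX_SLEW_V_PER_SAMPLE = 5.0   # normal signals change slowly between samples

def classify(samples):
    """Label each voltage sample as 'work' or 'transient'."""
    labels = []
    prev = 0.0
    for v in samples:
        slew = abs(v - prev)
        if abs(v) > TRANSIENT_V or slew > MAX_SLEW_V_PER_SAMPLE:
            labels.append("transient")   # candidate for shunting or capture
        else:
            labels.append("work")
        prev = v
    return labels

# One injected 900 V spike; note that the sample after it also trips the
# slew-rate check, since the line swings back down just as violently.
trace = [0.0, 1.5, 1.5, 0.0, 900.0, 1.5, 0.0]
print(classify(trace))
```

The real engineering problem, of course, is doing this discrimination in hardware at nanosecond speeds, not in software after the fact; the sketch only illustrates the logic of telling "noise" from "work."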

In any case, I strongly suspect that behind DARPA's latest mission to do an end run around Moore's Law, much more is at stake, and much more is not being said. As usual, it is being said between the lines: it is the "partitioning of electronic circuits" and the self-evolving circuit architecture that the article implies which indicate much more is afoot than meets the eye.

See you on the flip side...


  1. . . . Pondering what DARPA wants (probably their version of AI throwing a digital tantrum), somewhere in one’s cranial cavity amongst the convolutions “FOXCONN” made an appearance. . . A question followed, “How efficient and adaptive will their engineered production methods be for not just liquid crystal display (LCD) units, but for on-the-fly re-tooling assignments for the fabrication or manufacturing of other precision components – for, say – ‘stacking and partitioning electronic circuits’?”. . Instead of the circuit-board sprawl of multiple high-precision chips, one could quantify that capacity into one fingertip-sized cube, or tetrahedron, or sphere-like structure, or a combination of all three and more, for example, with a very high degree of reliability in moving electrons and other energy portions. . .

    . . . One would seem to be boldly going. . . In some ways, Foxconn has its sights on artificial intelligence (AI) and other robotics projects. . . Like Chrysler with its car and truck production during the World War II era, which also took on manufacturing that high-precision instrument, the gyro-compass, or, earlier last century, Remington, producing not only typewriters but also firearms and parts, most likely for another war effort. . . Foxconn has the LCD units and ________ (fill-in-the-blank sort of thing, likely still in the planning stages). . . It seems reasonable to expect to be able to expand one’s outlook while having a primary task. . . DARPA reinventing electronics to account for quantum computing techniques and other bold production capacities also seems reasonable, as industrial security from industrial espionage (overseas folks already have enough stolen secrets) would be an item of interest to national security and, especially, to US private-sector research and development. . .

    . . . “Position” is an all-important frame of reference when considering how to be somewhere strategically and reliably (possibly nearly instantaneously, even virtually), as well as able to navigate a quantum environment and, while there, to complete a designated task. . . It seems that engineering such a tasking might also require the engineer to be there, if only virtually, not just directing robots. . . One question goes begging: “How can that be done from one’s human frame of reference, using inept, decades- and centuries-old thinking habits?”. . It does not seem possible. . .

    . . . Position, “X” marking a point or space apparent, is an important consideration across the board of realization, too, be it in targeting, denying targeting sources, moving energy portions efficiently and accurately, imparting matter presence for a given position (say, information storage (data)), measuring 3-D designations or places, as well as just reliably navigating space that is apparent. . . Being well positioned for transportation, work-force support, resource availability, and communications, to name a few things, is also a practical matter of being in the right position for commerce. . .

    . . . Doing things differently, it seems, is a better approach than using the same old junk laws and methods that have already reached their limits. . . That invariably means new math and, especially, a new lexicon fit for purpose, instead of trying to shoe-horn an old jargonized rant onto a publicly presented paper to make something sound presumably new, only to have it confused with the old talk. . . Leibniz and Newton did just that with what became calculus. . . The old math did not measure up, in its day, for the purpose of calculating changes in motion; the new math, calculus, did, and now it needs work, too, to accommodate new sensory experiences of observation and motion, as well as to re-evaluate what was thought to have been observed last century and in the centuries before. . . The old lexicon definitely needs work toward better expression to fit bold steps forward. . . Most of these areas are new territory for younger folks as well as for older folks who dare to learn one more new trick. . .

    . . . Using an octane level of speculation higher than one is accustomed to using anyway, one arrived a while back at the following assumption: the Defense Advanced Research Projects Agency (DARPA) and the elder mentality of the Army’s Research and Development are too close to distinguish, but both are formidable military sources for national defense. . . In addition, of course, there is your version of DARPA (Diabolically Apocalyptic Research Projects Agency), since they sometimes seem to act that way when stupidity ensues from poor decisions entangled with excessive greed and ambitions. . .

    . . . In one’s humble opinion, one should not use the old dogmas of science, including those of the life sciences (the investigative material and ethereal histories), as if those laws and rules cannot be rescinded, recycled, or replaced. . . That is not only ridiculous but dangerously stagnating, especially for a profit margin. . . There is a great deal not accounted for or explained. . . That is not to say barter and capitalism have not had their successes in exchanging ideas and end products, but they have also had their failures, thefts, and unintended displacements and consequences. . . The presumption that another’s technology should be back-engineered and re-constructed to fit like a glove for another species is even worse, especially since it was acquired through happenstance. . . Those to whom it once belonged, and might still belong, might not be so humble an acquaintance, and could be quite formidable in the most subtle and insidious of ways. . . Suffice it to say, noting another technology, even if not yet explained, is more than enough sensory input to know that not all things are the same just because some gifted orator said so long ago, to clumsily paraphrase ‘my way or the highway,’ and acquired a sycophantic following that boarded a band wagon now figuratively headed off the cliff. . .

    . . . That this presumptively mystifying entity called DARPA wants to reinvent electronics, . . . fine, . . . but those folks will need to pony up tangible returns in real time, not duplicate the scantily laden lies from NASA, previous DARPA, enlarged university-associated cadres, and the agencies and government boasting how hallowed their halls are. . . There have been too many incidental failures, and they need to be corrected, not financially endowed to spoof. . .

  2. While slowly re-reading that DARPA article regarding automating chip designs, a couple of things came to mind.

    First, this remark:

    “No one yet knows how to get a new chip design completed in 24 hours safely without human intervention,” says Andrew Kahng.

    The implication is that new chip designs CAN be completed in 24 hours WITH human intervention. So where would there be a need for reliable automated chip designs on demand?
    Deep-space exploration comes to mind (and the warp-capable-within-100-years thing?).
    The quadrupling of the budget indicates an acceleration of the initial roll-out schedule.

    A second thing that caught my attention is this passage:

    “The agency hopes that the automated design tools will inspire smaller companies without the resources of giant chip makers, just as specialized brewers in the US have innovated alongside the beer industry’s giants.”

    I immediately thought of Distributed Manufacturing and Additive Manufacturing, aka 3D Printing.

    A third passage brought to mind visions of a ubiquitous, distributed computing architecture, more like a textile-like computing layer made of neural-network elements.

    “new materials, and new ways of integrating computing power and memory, will be needed. Shifting data between memory components that store it and processors that act on it sucks up energy and creates one of the biggest hurdles to boosting processing power.
    Another ERI project will explore ways in which novel circuit integration schemes can eliminate, or at least greatly reduce, the need to shift data around. The ultimate goal is to effectively embed computing power in memory, which could lead to dramatic increases in performance.”

    This implies they’ve solved the memory-lag problem, and that they’ve also solved the energy supply AND dissipation constraints.
    Now they need to roll those solutions out.

    Next:

    DARPA wants to create hardware and software that can be reconfigured in real time to handle more general tasks or specialized ones.

    Optical computing with programmable diffraction gratings (made up, one presumes, of non-linear metamaterials) might be an immediate way to implement real-time reconfiguration of computing elements. (Think programmable analog computers.)

    Now add in the Space Force, sprinkle in a little distributed quantum computing (in particular, the work of Michelle Simmons and her team on quantum computing in silicon; see links below), and one might begin to perceive their real road map: retooling the domestic manufacturing base for space ‘exploration’ on steroids; a space-based ECONOMY?


    All of those seem to be waypoints toward the Star Trek replicator, for Geordi to repair and create new stuff on demand.

    I think.


  3. My first thought was of aircraft control systems. Back when fighters were intentionally being designed to be unstable (for maneuverability), much thought was given to how much ‘redundancy’ was needed for such systems. It has pretty much stabilized (sorry) at four or five independently-running computer systems, with a ‘voting system’ automatically discarding an errant system. (When you got down to two or three systems left, ‘old’ fighters reverted to some kind of mechanical backup. Newer fighters have NO mechanical backup…)

    Applying the above to DARPA and chips, I could see an independently voting array of ‘sub-computers’ on a chip (ideally, each of somewhat different design) to cope with one or more being taken out by an EMP or such. Each on-chip sub-computer would have a test algorithm running continuously in the background. Changes? Voted out. On another line of thinking, additional on-chip sub-computers could be kept off-line or ‘insulated’ until some Event caused them to be activated. Different forms of redundancy…
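The voting scheme described above can be sketched in a few lines; the channel names, the results, and the idea of returning the dissenting channels are illustrative assumptions, not any real flight-computer interface:

```python
from collections import Counter

def vote(outputs):
    """Majority vote across redundant channels.

    outputs maps a channel name to its computed result.  Returns the
    majority value and the list of dissenting channels, which a real
    system would take off-line.  (Channel names and values are made up.)
    """
    tally = Counter(outputs.values())
    winner, _ = tally.most_common(1)[0]
    dissenters = [ch for ch, val in outputs.items() if val != winner]
    return winner, dissenters

# Four independent channels; channel "C" has been hit by a transient.
value, bad = vote({"A": 42, "B": 42, "C": 7, "D": 42})
print(value, bad)  # 42 ['C']
```

Note that with only two channels left a disagreement has no majority, which is exactly why the old fighters fell back to a mechanical backup at that point.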

    On “automating the process with machine learning and other tools,” this obviously raises the stakes if SkyNet ever gets loose…

    (By the way, it looks like two blogs were switched. The present URL reads:

  4. 24hrs, huh?

    If I recall, it was Dr. Farrell who suggested that 3D printing was black-budget tech drip-fed to the public. I think that they occasionally release things like this into the wild (the free market) and essentially crowd-source R&D. To me, 24 hours seems to be a feat for tech such as 3D printing. Perhaps some Silicon Valley outfit or a startup ‘think tank’ made a breakthrough in printable circuitry, and the overlords scooped it up, with the American taxpayers footing the bill via the recent defense budget increase?

  5. Digital circuits designed to work at 1.5 volts do not handle very well the voltage spikes delivered by static electricity built up in the human body. Asking them to handle the voltage spikes generated by lightning strikes or an EMP is asking a great deal. Asking them to do so instantaneously is asking even more. You’re talking about nanometer gaps handling a voltage differential of possibly millions of volts. I’m not certain this is possible with current physics. Maybe they know something we don’t.
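A back-of-the-envelope calculation shows the scale of the problem this comment points at; the 10 nm gap and the SiO2 breakdown figure below are rough, order-of-magnitude assumptions, not measured values:

```python
def field_strength(volts, gap_m):
    """Electric field across a gap, E = V / d, in volts per metre."""
    return volts / gap_m

GAP = 10e-9           # ~10 nm: a rough stand-in for a modern on-chip gap
SIO2_BREAKDOWN = 1e9  # V/m: order-of-magnitude dielectric strength of SiO2

normal = field_strength(1.5, GAP)   # nominal 1.5 V logic level
spike = field_strength(1e6, GAP)    # a hypothetical megavolt transient

print(f"normal: {normal:.1e} V/m (below breakdown)")
print(f"spike:  {spike:.1e} V/m (~5 orders of magnitude past breakdown)")
```

Even the nominal logic level sits within about an order of magnitude of the oxide's breakdown field, which is why a megavolt-class transient cannot simply be "absorbed" at the gap; it has to be diverted before it ever reaches the chip.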

      1. Sounds like a great idea for power quality and reducing power losses. It does nothing for the above, though. I’m not certain even this technology could handle a lightning strike or an EMP; and you’re still left with too much voltage across too small a gap inside CPUs and integrated circuits.

      1. Materials science is still in its infancy. The possibilities are infinite. Quite possibly something dropped into their laps via a crashed craft. The universe is a big place with endless possibilities.

Comments are closed.