THE OPACITY OF AI AND THOSE BOEING PLANE CRASHES

Many people have been trying to warn of the dangers of artificial intelligence, my own small voice among them. And now, there is a strange article from Zero Hedge that draws attention to the dangers in connection with those Boeing 737 Max 8 airplanes, two of which have crashed - with fatalities - in recent months, prompting several airlines either to ground their fleets, or to cancel orders. Mr. E.G. and Ms. K.M. spotted this article, and as one might expect, it prompts some high octane speculation, or rather, perhaps a revisit to some of my earlier warnings about the increasing reliance on AI:

Is The Boeing 737 Max Crisis An Artificial Intelligence Event?

The problem, according to the article, is an "MCAS patch", a safety anti-stall program on the aircraft, and it's worth noting what this article says about it:

I think the problem is that the Boeing anti-stall patch MCAS is poorly configured for pilot use: it is not intuitive, and opaque in its consequences.

By way of full disclosure, I have held my opinion since the first Lion Air crash in October, and ran it past a test pilot who, while not responsible for a single word here, did not argue against it. He suggested that MCAS characteristics should have been in a special directive and drawn to the attention of pilots.

And there's another problem:

Boeing had a problem with fitting larger and heavier engines to their tried and trusted 737 configuration, meaning that the engines had to be higher on the wing and a little forwards, and that made the 737 Max have different performance characteristics, which in turn led to the need for an anti-stall patch to be put into the control systems.

So the patch was put into the system. But then there's this at the end of the article:

After Lion Air I believed that pilots had been warned about the system, but had not paid sufficient attention to its admittedly complicated characteristics, but now it is claimed that the system was not in the training manual anyway. It was deemed a safety system that pilots did not need to know about.

This farrago has an unintended consequence, in that it may be a warning about artificial intelligence. Boeing may have rated the correction factor as too simple to merit human attention, something required mainly to correct a small difference in pitch characteristics unlikely to be encountered in most commercial flying, which is kept as smooth as possible for passenger comfort.

It would be terrible if an apparently small change in automated safety systems designed to avoid a stall turned out to have given us a rogue plane, killing us to make us safe.

I don't know about you, but I find all this profoundly disturbing, because what is being alleged in the article is something like the following:

1) The MCAS "patch" was "too opaque" for humans; it was "not intuitive";

2) It was therefore kept out of the pilots' training manuals; and

3) The cumulative effect of these decisions was that pilots were essentially not in control of their aircraft, and the system itself may have crashed them.

Most people are aware that I don't fly at all. I used to, but even then I was never comfortable doing so. These days I simply and flatly refuse, and in part it's because of stories like this. And we've already seen similar stories with "self-driving cars". Automated semi-trucks are reportedly already on the road, though we've yet to hear of any accidents involving them or failures of their AI. But whether or not it's true that automated AI trucks are driving down the road, I can say that there's something else that has happened. Not too far from where I live, the local Burlington Northern Santa Fe railroad does run freight trains with no engineer in the cab of the locomotives. There's even a little sign to this effect at a prominent railroad crossing that I often cross, or rather, used to cross until the impact of the sign warning about the automated train finally sank in. Now I drive a block or two out of my way to cross over the tracks on a bridge.

As I've written before about this increasing distance between humans and the machine society, I'd like to apply the article author's template to a completely different area that I've written about before in conjunction with artificial intelligence: computer trading. I've blogged about this phenomenon, and talked about it occasionally on Catherine Austin Fitts' Solari Report quarterly wrap-ups. My concern there has always been that with computers doing most of the trading - often in mere fractions of a second - commodities, equities, and securities markets are no longer genuinely reflective of human trading activity. Apply that template to currency speculation and trading and one has, in my opinion, a recipe for disaster. Here too, the article author's template of non-intuitive behavior, the opacity of the system, and so on, would seem to apply. The occasional "flash crashes" in markets are a testament to the fact that price no longer functions adequately as a measure by which to evaluate and make decisions, since it can literally crash or spike in a matter of mere seconds, and in this regard I cannot help but wonder if the article's author is not on to something, namely, flash crashes of a very different, literal sort.

See you on the flip side...


Joseph P. Farrell

Joseph P. Farrell has a doctorate in patristics from the University of Oxford, and pursues research in physics, alternative history and science, and "strange stuff". His book The Giza DeathStar, for which the Giza Community is named, was published in the spring of 2002, and was his first venture into "alternative history and science".

36 Comments

  1. Reno on March 17, 2019 at 12:47 am

    Driving my son’s new Accord to SC was my first experience with new “safety” tech like lane control assist and adaptive cruise control. The former nudges you back toward the lane you are leaving unless you signal. Now who signals on interstates with very light traffic, especially when you are gaining on a tractor trailer? Here is where the adaptive cruise control poses a danger. Normally you would just pull around the slower vehicle, a very safe maneuver. However, the adaptive cruise control slows the engine, and a car behind me almost ran up on me as the driver intuitively expected me to speed up to pass the truck instead of slowing down. Worse still, if you pass a lot without signaling, a warning light comes on saying you are driving poorly and should stop for coffee! So it is keeping a report card on you, which someday will be fed to the insurance companies. These features make driving different, so I imagine how bad it must be on a jet. Are you told about this at the dealership? No. To be fair, I believe these car systems can be turned off if you have pored through the manual.



  2. justawhoaman on March 15, 2019 at 12:39 pm

    Latest spin on the crash, from ZeroHedge: https://www.zerohedge.com/news/2019-03-15/jackscrew-found-ethiopian-crash-site-confirms-boeing-737-max-was-set-dive

    In the comment section, there was a discussion about what was actually found on the crash site which included the following:

    “The Ethiopian plane crash produced no fire or explosions of any kind. No parts have a single burn of any kind. No fuel ignited even though it was full of jet fuel. The site has NO indication of a fuel spill of any kind. The FAA and Boeing ‘investigators’ were not even wearing masks!

    No wiring was found. Not one single piece. 600 miles of wiring and not one piece was found.

    No tailsection was found. Not one piece. No call letters, no insignia, nothing.

    No wings were found. Not one single piece.

    No fuselage was found. Not one piece.

    No seats were found. None.

    No windows were found. None.

    No luggage was found. None. Though unburnt and unpackaged passports were found.

    The ONE small, old engine or APU shown will never be allowed verification. The engines vaporized even though they are made of titanium.

    There is no blood and no body parts. All humans were completely intact and placed into body bags though no independent verification will be allowed. Even though the entire plane and engines vaporized all bodies were completely intact with no verification allowed.”

    Sounds like a plan to take down Boeing… in more ways than one.



    • eliot on March 15, 2019 at 3:27 pm

      http://time.com/5552076/boeing-737-crash-trump-boeing/

      This may be one reason. The “crash scene” really was pathetic. They even left the backhoe there that they dug the hole with.



    • Robert Barricklow on March 15, 2019 at 6:06 pm

      Ken Lay crash-site territory?
      Dead or Alive?

      There’s a lot of that these days:
      fake school shootings, fake news, fake healthcare, fake college degrees, fake food, fake championships[winner scripted, like in wrestling], fake elections, fake crashes?
      ______ [fill in the blank].



    • goshawks on March 15, 2019 at 6:38 pm

      justawhoaman, I have to agree that the few crash-site photographs I have seen were underwhelming. Even at a crash speed of nearly 400 mph and being in a dive, I expected more debris. Plus, you brought up a good point: The aircraft would have had a goodly amount of fuel on board, being just after takeoff. That fuel would have “whoomped” no matter the speed or angle of impact. In a diving impact, it would have “whoomped” right in the crater holding the debris. In the few photos I have seen, there is no fire damage. We may have to find a site with major photos to sort those aspects out…



  3. goshawks on March 14, 2019 at 11:26 pm

    Another good article, with many informed comments:
    https://arstechnica.com/information-technology/2019/03/faa-acting-chief-says-satellite-data-provided-link-between-boeing-crashes/

    Commentator ‘shelbystripes’ writes:
    “The pilots on the Lion Air flight immediately before the accident had to run through three different checklists*. One was to address an airspeed unreliable warning; one was to address an altimeter disagree (the two altimeter sensors mismatch) warning; and one was to address runaway trim. Each checklist is to respond to a different symptom of a problem, essentially. In a lot of cases, the solution you identify with the checklist will solve the root cause.

    But in this case, the symptoms were all actually due to an AoA sensor error, and nothing on the airspeed unreliable or altimeter disagree checklists would’ve corrected the AoA sensor data. Nothing could have, in the air; you’d need to replace the sensor. So, nothing on the first two checklists would’ve solved the runaway trim problem at all. Once they got to the third checklist, they managed to diagnose by activating the stabilizer trim cutouts; at which point, the uncommanded trim down stopped. The ‘treatment’ for the symptom is to just leave the auto trim cut out, and manually trim for the rest of the flight.

    They just had to reach that point in the third checklist, while flying the plane in the meantime.”
    .
    * “Each checklist is tailored to help address a specific warning or condition, by testing the different possible causes and providing instructions for what to do once the cause is identified.”

    (Note that the above crew had the problem occur in cruise, while the two crashed aircraft had the problem occur minutes after takeoff. The airport the Ethiopian flight took off from was at 7,631 feet above sea level, and the aircraft only reached 8,025 feet above sea level. So, that flight crew had only around 400 feet above ground level to guess or run the checklists…)

    Another pertinent comment by ‘normally butters’:
    http://www.b737.org.uk/mcas.htm
    “The intended MCAS activation criteria are:
    Angle of Attack high
    Autopilot disengaged
    Flaps up
    Steep banked turn”

    (So, the two flights which crashed took off with normal flap settings, started climb-out, and commanded flaps-up as normal. As soon as the flaps retracted, the MCAS activated – likely due to erroneous AoA sensor readings – and pitched the nose down. For the Ethiopian flight, that occurred at 400 feet above ground level…)
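
    To make that failure mode concrete, here is a minimal sketch in Python. It is purely an illustration and NOT Boeing’s code: the names, the stall threshold, and the 2.5-degree trim increment are all assumed, and the quoted steep-bank case is left out for brevity. It only shows how activation logic keyed to a single AoA input can keep commanding nose-down trim as soon as the flaps retract after takeoff:

    # Illustrative sketch only -- not an actual avionics implementation.
    # Encodes the quoted activation criteria (AoA high, autopilot off, flaps up)
    # using a single captain's-side AoA value, as described above.
    AOA_STALL_THRESHOLD_DEG = 14.0   # hypothetical threshold
    TRIM_INCREMENT_DEG = -2.5        # hypothetical nose-down trim increment

    def mcas_trim_command(aoa_capt_deg, autopilot_engaged, flaps_up):
        """Return a nose-down stabilizer trim increment, or 0.0 if inactive."""
        active = (flaps_up and not autopilot_engaged
                  and aoa_capt_deg > AOA_STALL_THRESHOLD_DEG)
        return TRIM_INCREMENT_DEG if active else 0.0

    # A failed sensor stuck roughly 20 degrees too high re-triggers the command
    # on every cycle once the flaps come up during climb-out:
    for cycle in range(3):
        print(mcas_trim_command(aoa_capt_deg=22.0, autopilot_engaged=False, flaps_up=True))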



    • goshawks on March 14, 2019 at 11:46 pm

      Plus, another comment from ‘shelbystripes’ on the prior-day Lion Air flight:

      “But here’s the thing: Nobody knew that the AoA sensor had failed. It turns out that in the 737MAX, an ‘AoA disagree’ warning is an optional add-on. Lion Air apparently didn’t have that optional component, so the pilots of the prior day’s flight didn’t know or report an AoA sensor failure. The plane did throw out airspeed and altitude mismatch warnings, and because the pilots reported those warnings, mechanics tested the pitot tubes and static air sensors. That article notes the Lion Air maintenance logs don’t indicate any testing of the AoA sensors.” [italics in original]



  4. goshawks on March 14, 2019 at 8:23 pm

    For what it is worth, Benjamin Fulford (yeah, I know) has a different idea on the causes of the crashes:
    https://kauilapele.wordpress.com/2019/03/14/full-article-benjamin-fulford-3-11-19-pope-francis-fired-as-13-bloodlines-gnostic-illuminati-negotiate-end-of-western-civil-war/#more-77261
    “The most visible recent move in this secret war was the downing on Sunday of an Ethiopian Boeing 737 aircraft. Mossad sources say the aircraft was remotely hijacked and its passengers, including 19 UN officials, were killed as part of an attempt to disrupt a UN conference aimed at ending poverty and environmental destruction in Africa.

    However, the remote hijacking of this aircraft is looking to have been a really stupid move by the Khazarians and is likely to lead to the bankruptcy of the Boeing Corporation. That is because it is the second time in four months that a Boeing aircraft has been crashed via remote control by the Khazarian mafia. China, Ethiopian Air (Africa’s largest airline), and the Cayman Islands have grounded all Boeing 737s, and other airlines and countries are sure to follow until the remote-control hijackers are removed from control of Boeing [airliners].”



  5. marcos toledo on March 14, 2019 at 7:50 pm

    Artificial intelligence is an oxymoron and a convenient patsy for high-tech mass murderers to use for their industrial-scale murders.



    • Robert Barricklow on March 15, 2019 at 12:36 pm

      Marcos
      Its raison d’etre.
      Blame it on the AI
      [Bossa Nova]



  6. marcos toledo on March 14, 2019 at 7:27 pm

    I was logged in and was posting a comment on this post. I had to log in again to post this comment. I rest my case on relying on AIs to run anything, including websites.



  7. Roger on March 14, 2019 at 7:20 pm

    What if we all are being monitored by algorithms? What if those who are against certain UN agendas and who are found to be highly influential online are being given scores indicating they are a high-level threat to global well-being? What if, once one reaches a certain threat level, AI is programmed to mess with your car during heavy downpours on the interstate? What if, at an even higher threat level, it has been programmed to look for opportunities such as your flights, so long as no one deemed essential is onboard? What if you discover something that could threaten the big monopolies’ monopoly and are put on this list? What if this list gets expanded to include political opponents or people who are deemed useless? What if there is a secret agreement to do this without people or governments knowing about it or being involved in the decision making? What if this AI starts getting too obvious and people begin to wonder about it?



  8. Westcoaster on March 14, 2019 at 5:58 pm

    If the 4 dead pilots had been briefed and checked out on the “MAX”, these crashes and the other deaths might not have occurred. It was Boeing’s marketing decision NOT to mandate that pilots be certified on the “MAX”, since it was just the 737 with new, more fuel-efficient engines. No additional pilot training required! Only it’s not just the 737, because these are larger engines, mounted in such a manner as to change the COG (Center of Gravity) of the plane. My understanding is that with all control surfaces neutral, the plane would climb. So the MCAS software was developed to make the plane fly straight and level. Only it obviously didn’t work that way in actual use.



  9. eliot on March 14, 2019 at 4:43 pm

    https://mobile.twitter.com/MilspecP/status/1106264636494626816?p=v

    These are comments from an ex-Army pilot suggesting (strongly, in my opinion) that this most recent crash did not happen. He thinks it was a false flag…. I think some UN people (with Clinton Foundation connections) were trying to “disappear”.



    • eliot on March 14, 2019 at 5:10 pm

      https://mobile.twitter.com/MindFuror/status/1105948836227239937

      This is a link to information about the passengers’ connections to the Clinton Foundation. I find this incident interesting for two reasons: 1 – UN people felt they were in enough danger to fake their own deaths, and 2 – the deep state is pissed off at Boeing enough to shut down an aircraft designed in 1969. (Deep state civil war?)



      • Robert Barricklow on March 14, 2019 at 5:20 pm

        Way to go eliot!
        That’s outside the box…
        fast-inside pitch into the Clinton Strike-Zone!
        Loved it!



  10. Deborah Warren on March 14, 2019 at 2:13 pm

    Transport Canada …. from midnight March 14 … a notice from my Westjet account
    “Transport Canada has this morning made the decision to ground all Boeing 737 MAX aircraft effective immediately and to close Canadian airspace to this aircraft series. We are fully complying with this decision and have taken the necessary steps to safely ground all 13 Boeing MAX aircraft currently in service.
    We’d like to reassure you that over 92 per cent of our fleet is operating as usual. While there may be some interruptions to our schedule as we mitigate the impact of this decision, you can book with confidence knowing that WestJet continues to fly throughout our network with the safety of guests and employees at the forefront.
    Regards,



  11. Miguel Oniga on March 14, 2019 at 12:06 pm

    This whole WhatsApp collectivism must go. Wanna talk to me? Call me up on the phone!



  12. Robert Barricklow on March 14, 2019 at 11:38 am

    I keep drilling this point home about their mantra:
    The human is the problem
    The technology is the solution.
    Following privatization: an unaccountable & unelected infrastructure theme – who is accountable?
    Does the public even have a say?
    [posting as I read]
    Yep, there is no price discovery mechanism.
    For a minute there, at the end, I thought you were going to say I cannot help but wonder if the author’s article is not… drum roll… AI itself.
    Oh! but, that’s next week? next month? next year?
    Or, is it just not being evenly distributed?



    • Robert Barricklow on March 14, 2019 at 11:46 am

      ..and who here wants to become another casualty of AI’s long & tortured learning curve – all for a cause that is, ultimately, to serve an unelected, unaccountable pyramid-capped leadership for optimized profits & ultimate control?



  13. WalkingDead on March 14, 2019 at 9:49 am

    Allowing any machine the ability to choose who lives and who dies in any situation, be it an aircraft, automobile, train, truck, etc., when it does so entirely by the numbers generated by its sensors, which are themselves subject to failure, is extremely questionable at best. A human would intuitively make the correct choice based on empathy, emotion, knowledge, and experience. While a human may fail occasionally due to human error, the machine would make the same choice every time based solely on its inputs.
    This was addressed in the movie “I, Robot” when the robot chose to save the injured man over the uninjured child based strictly on the percentages of who would be most likely to survive. A human would have saved the child.
    Automated machinery has its uses, but allowing machines to choose who lives and who dies is not among them. The fact that they are attempting to put this on the battlefield scares me.



    • mpaff on March 14, 2019 at 12:06 pm

      Empathy is what separates man from the machines… If an organism resembling a human does not exhibit empathy, what does that make them? Human sacrifice and paedophilia are, in my opinion, methods to determine empathetic capacity…



  14. LGL on March 14, 2019 at 8:47 am

    @JPFarrell:
    Two tweets by POTUS DJT regarding this accident:

    https://twitter.com/realDonaldTrump/status/1105468569800839169

    https://twitter.com/realDonaldTrump/status/1105471621672960000


    Airplanes are becoming far too complex to fly. Pilots are no longer needed, but rather computer scientists from MIT. I see it all the time in many products. Always seeking to go one unnecessary step further, when often old and simpler is far better. Split second decisions are….
    ….needed, and the complexity creates danger. All of this for great cost yet very little gain. I don’t know about you, but I don’t want Albert Einstein to be my pilot. I want great flying professionals that are allowed to easily and quickly take control of a plane!



    • mpaff on March 14, 2019 at 9:00 am

      Hear! Hear! Take Microsoft products for example, the more they improve them, the worse they get!



  15. goshawks on March 14, 2019 at 8:45 am

    (Modded. Caught by c0ck in “c0ckpit”. I feel so much safer now. One more time: )

    This is a good article on the latest 737 crash, with many, many informed comments:
    https://arstechnica.com/information-technology/2019/03/another-737-max-jet-crash-prompts-groundings-by-china-indonesia-ethiopia/

    The 737 is about as far from an AI-controlled aircraft as you can get. The 737 is a legacy aircraft. It was designed during cable-control and hydraulic-actuation days. The computer-controlled MCAS has been ‘grafted’ on top of the well-proven legacy equipment. It is in effect a fly-by-wire sub-system nestled within a classical system.

    All 737s have two ‘Angle of Attack’ (AoA) sensors and two Airspeed sensors, mounted on either side of the forward fuselage. In older-model 737s, this information was used only for c0ckpit instruments and input to the autopilot for cruise. When the 737MAX series came out, Boeing wanted them to have similar flight characteristics to the previous 737NG series. So, the MCAS was designed for two things: to ‘step in’ and adjust the pitching characteristics of the MAX to match those of the NG, and to provide automatic stall protection via a pitch-down response based on AoA and Airspeed inputs. So far, so good…

    The problem is that MCAS is making ‘judgments’ based on only one AoA sensor – the one on the Pilot’s side. The Co-Pilot’s AoA is not input to the MCAS. That makes the MCAS vulnerable to the failure of a single sensor. In the Lion Air crash, preliminary flight-recorder readouts show that the two AoA inputs to the flight-recorder were 20 degrees different. So, the likely case is that the MCAS – based on faulty input from only one AoA sensor – computed that the MAX was in a stall and repeatedly kept pitching the nose down. This led to the flight crew becoming confused about being in a (unreal) stall and not diagnosing the real cause (bad AoA data) before losing control of the aircraft. (It appears from preliminary radar data that the Ethiopia crash had the same ‘bobbing’ motion before diving-in.)

    Note that Boeing offers an optional indicator/warning light that tells the flight crew that the AoA sensors are registering different AoA. Once that situation is known, any automated system with AoA input can be manually shut-down. It appears that mainstream airlines are buying this option, but smaller operators are not. (In my humble opinion, the fact that this is optional at all is outrageous…)

    In my mind, any MCAS-like system requires at least triple-redundancy of the full input system (AoA sensor plus any circuit boards and wiring), plus the ability to automatically vote-out an errant pathway. I imagine Boeing will be forced to institute this redundancy – through airline pressure, if not accident boards.
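
    Below is a minimal sketch of that ‘vote-out’ idea: a generic three-channel median voter in Python. It is not an actual avionics design; the three-channel layout and the 5-degree disagreement tolerance are assumptions chosen only for illustration.

    # Illustrative three-channel voter: take the median of three AoA readings
    # and flag any channel that disagrees with the median beyond a tolerance.
    DISAGREE_TOLERANCE_DEG = 5.0   # hypothetical tolerance

    def voted_aoa(readings_deg):
        """Return (selected_value, failed_channel_indices) for three readings."""
        median = sorted(readings_deg)[1]
        failed = [i for i, r in enumerate(readings_deg)
                  if abs(r - median) > DISAGREE_TOLERANCE_DEG]
        return median, failed

    # One sensor stuck 20 degrees high is outvoted instead of driving the trim:
    print(voted_aoa([2.0, 22.0, 2.5]))   # -> (2.5, [1])

    With three independent inputs a single bad sensor is simply flagged and ignored, which is the point about not granting any automated pitch authority on the strength of a single pathway.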

    (I fully expect any future 737-replacement aircraft will be a modern fly-by-wire aircraft, probably building-on 787 advances. That includes a quadruple-redundancy input & voting system for dispatch reliability: one non-functional and you can still go. Then, we can worry about any AI ‘advances’ being added to that aircraft…)



    • MarcosW on March 14, 2019 at 2:29 pm

      As a pilot and systems engineer and manager of large corporate technology programs, I agree entirely with goshawks’ assessment that this is not an issue of AI. Rather, this is an issue of a lack of engineering ethics driven by profit/marketing motives. For example, I find it unconscionable that pre-sales flight testing was not done in such a manner as to reveal what should have been an expected anomaly. And if such was revealed, even more unconscionable that the product was released without appropriate documentation and training.

      Having said this, the same consciousness and lack of engineering ethics that produced this error will inevitably produce even more havoc when applied to products incorporating AI. In my humble opinion there is ultimately no replacement for a well-trained human at the controls of a complex vehicle. The best use of AI is to augment, but not replace, the human ability to respond to events that require decisions to be made outside the human system’s ability to process due to time-constant limitations.



  16. goshawks on March 14, 2019 at 8:32 am

    This is a good article on the latest 737 crash, with many, many informed comments:
    https://arstechnica.com/information-technology/2019/03/another-737-max-jet-crash-prompts-groundings-by-china-indonesia-ethiopia/

    The 737 is about as far from an AI-controlled aircraft as you can get. The 737 is a legacy aircraft. It was designed during cable-control and hydraulic-actuation days. The computer-controlled MCAS has been ‘grafted’ on top of the well-proven legacy equipment. It is in effect a fly-by-wire sub-system nestled within a classical system.

    All 737s have two ‘Angle of Attack’ (AoA) sensors and two Airspeed sensors, mounted on either side of the forward fuselage. In older-model 737s, this information was used only for cockpit instruments and input to the autopilot for cruise. When the 737MAX series came out, Boeing wanted them to have similar flight characteristics to the previous 737NG series. So, the MCAS was designed for two things: to ‘step in’ and adjust the pitching characteristics of the MAX to match those of the NG, and to provide automatic stall protection via a pitch-down response based on AoA and Airspeed inputs. So far, so good…

    The problem is that MCAS is making ‘judgments’ based on only one AoA sensor – the one on the Pilot’s side. The Co-Pilot’s AoA is not input to the MCAS. That makes the MCAS vulnerable to the failure of a single sensor. In the Lion Air crash, preliminary flight-recorder readouts show that the two AoA inputs to the flight-recorder were 20 degrees different. So, the likely case is that the MCAS – based on faulty input from only one AoA sensor – computed that the MAX was in a stall and repeatedly kept pitching the nose down. This led to the flight crew becoming confused about being in a (unreal) stall and not diagnosing the real cause (bad AoA data) before losing control of the aircraft. (It appears from preliminary radar data that the Ethiopia crash had the same ‘bobbing’ motion before diving-in.)

    Note that Boeing offers an optional indicator/warning light that tells the flight crew that the AoA sensors are registering different AoA. Once that situation is known, any automated system with AoA input can be manually shut-down. It appears that mainstream airlines are buying this option, but smaller operators are not. (In my humble opinion, the fact that this is optional at all is outrageous…)

    In my mind, any MCAS-like system requires at least triple-redundancy of the full input system (AoA sensor plus any circuit boards and wiring), plus the ability to automatically vote-out an errant pathway. I imagine Boeing will be forced to institute this redundancy – through airline pressure, if not accident boards.

    (I fully expect any future 737-replacement aircraft will be a modern fly-by-wire aircraft, probably building-on 787 advances. That includes a quadruple-redundancy input & voting system for dispatch reliability: one non-functional and you can still go. Then, we can worry about any AI ‘advances’ being added to that aircraft…)



  17. OrigensChild on March 14, 2019 at 8:20 am

    Well, what can I say? I’ve been a software engineer. Catastrophic failures in software used to be blamed on a “computer bug”. I cannot tell you how many times companies have hidden behind that mask when a customer was inconvenienced by a “glitch”, trying to hide the mistake behind the ubiquitous excuse of the “machine” rather than the person who programmed it. Now it’s fashionable to hide a catastrophic failure behind an even more nebulous excuse for grave human error: artificial intelligence. With respect to Dr. Farrell’s hypothesis, to which I am highly sympathetic in other venues, I believe this “system” did NOT have to be complicated, and as such probably was NOT AI. Any algorithm requiring measurements for command and control adjustments with such a limited scope need not be intelligent at all. The type of thinking required appears reactive, thus “hard-wired”; AI generally operates in a proactive mode. Frankly, I smell an attempt to absolve a corporation from liability while privately investigating and sanctioning the programming department for a significant software error. Think about it. Though I agree with the overall thesis regarding AI at large, this is the “computer glitch” argument on steroids. Now, if the article is true and the system was AI, how much more liability SHOULD be heaped on the corporation responsible for its programming? Obviously the system was not well designed.



    • mpaff on March 14, 2019 at 8:57 am

      Always follow the money…!



  18. mpaff on March 14, 2019 at 8:12 am

    GIGO: “Garbage In, Garbage Out”. Computers are only as good as their programs, and programs are written by human programmers. Now there are programs that modify themselves based on feedback, but those feedback parameters were still chosen by a programmer. I refuse to believe in a “Ghost in the Machine” interpretation of AI. So, if there is some malefic trend occurring in the computerized world, there is someone behind it. The collective hubris of the proponents of removing human control is astounding. The metaphor falls apart when you remove the common surface. Can you completely remove it? I think not. It brings to mind, though, the beginning of Yeats’ poem “The Second Coming”:
    “Turning and turning in the widening gyre
    The falcon cannot hear the falconer;
    Things fall apart; the centre cannot hold;
    Mere anarchy is loosed upon the world,
    The blood-dimmed tide is loosed, and everywhere
    The ceremony of innocence is drowned;
    The best lack all conviction, while the worst
    Are full of passionate intensity.”
    How prophetic…



    • OrigensChild on March 14, 2019 at 8:25 am

      mpaff, you and I are thinking alike here on this one. Color me skeptical to some degree regarding the “ghost in the machine” idea, though I am giving room for the remote hijacking of an AI system, running on a quantum computer in a quantum state, by an external “agent”. This one sounds like a corporate public relations ploy to hide a bug in a routine beneath the cloth of AI–and that is an old, old tactic! I’m not buying this one at all!



      • mpaff on March 14, 2019 at 8:55 am

        I totally agree!



      • Robert Barricklow on March 14, 2019 at 11:54 am

        Perfect for industrial espionage; that may occasionally require a number of human sacrifices. However, it’s all for the good of the capitalist shareholders, who cash in by winning these industrialized spy games.



        • Robert Barricklow on March 14, 2019 at 11:57 am

          Has Boeing had a number of “puts” recently?



  19. anakephalaiosis on March 14, 2019 at 5:37 am

    SQUARE POINT THEATER 31
    Patriarch bull
    defines brutal empathy
    in driver’s seat.



    • anakephalaiosis on March 14, 2019 at 5:38 am

      SQUARE POINT THEATER 32
      Oh lonesome me,
      singing for prairie cows
      with my banjo.


