Automated Intelligence With Autonomous Interactions

Our industry is on a journey towards Automated Intelligence and that, for now, is the AI we are talking about, along with its interlocking AI, Autonomous Interaction.

The term AI—Artificial Intelligence—seems to be troweled on top of everything these days. But I feel we are a long way from its true definition.

The interlocking AI, Autonomous Interaction, occurs once we begin automating intelligence. It is early days yet, but we need to learn how to walk before we can allow true AI to run our buildings.

Automation is basically building hardware or software capable of doing things automatically. AI is about trying to make machines or software mimic human behavior and intelligence; automation may or may not be based on Artificial Intelligence.
We are still struggling with self-learning, with machine vision, and with voice interactions, all of which need to be solid before we can enter the true world of AI. I love this definition: AI is whatever hasn't been done yet.

We are now in the exciting era of Automated Intelligence because we are doing what can be done with today's tools while leaving what hasn't been done for some future definition of Artificial Intelligence.

I feel our journey is close to that of the autonomous car, a vehicle capable of sensing its environment and moving with little or no human input. I believe we are striving to achieve Autonomous Interactions in our buildings. This interaction is part of what I am suggesting might be "Building Emotion": automating through autonomous interaction with the inhabitants of our buildings. We have much to learn from the automotive industry's reinvention.

I asked Sudha Jamthe of IoTDisruptions for advice, and she provided these comments:

1. AI today is all about building the technology to capture what we do as humans and trying to automate it, using machine learning to build models based on training data of past behaviours. Another industry term I see for this is "Augmenting Intelligence." It is about using AI to help us do our work better instead of replacing it, e.g., making decisions faster with huge volumes of data.

2. I like your point about the comparison with the journey of autonomous vehicles. I see autonomous vehicles as setting the vision for how vehicles will drive themselves and free us of the risks of traffic deaths and the cost of owning cars. But getting there is a journey, which is more exciting for me because the car is not going to get there alone. The city infrastructure, buildings, parking, and energy sources all have to become smart to interact with the autonomous car. That journey is going to create a connected world, which I call the Driverless World. Read Sudha's March article on AI.

In this video, Jamthe talks with Rebecca Wolkoff, an engineer from Enel X, about real-time energy storage for smart buildings.

Over a year ago I wrote this editorial, Building Brains - AI is the Brain, IoT the Body speaking with the Face and Voice of Digital Transformation.

Machine learning or Artificial Intelligence "AI" is poised to take our industry, quickly, in directions never before conceived. "Building Brains," with its IoT body, face, and voice, is changing our world faster than we can imagine; actually, digitally transforming it.

Building Brains will evolve rapidly as more and more intelligent devices come with their own machine learning and AI. The challenge will be to manage these bit brains on the edge and use that intelligence for the greater good of the building, the campus, the city, and the world. The new task is to build building brains from the myriad of edge brains, creating a virtual-intelligence digital transformation community for our purpose. In the past, machine learning and AI resided only in the cloud, but with the development of open-source learning, voice platforms, and low-cost processor/memory at the edge, bit brains are now everywhere.

This article, Cutting Through the Hype Surrounding Artificial Intelligence in Smart Buildings, from James McHale, Managing Director, Memoori, has some important insights to share:

The talk of AI might always sound like the distant future, but the technology, in different forms, is already around us.

While there is still some debate, leading experts generally divide the anticipated evolution of AI into three generations, defined by the limits of their capabilities. The technology we experience in our smartphones, tablets, automated building systems, or other platforms is restricted to Artificial Narrow Intelligence (ANI). In the future, we should expect more advanced forms when we reach Artificial General Intelligence (AGI), and then Artificial Super Intelligence (ASI). There are still many benefits to be gained from ANI and, as is the nature of AI, even narrow systems learn over time to become more and more “intelligent.”

In a world awash with data, AI can help to find “the signals in the noise” by identifying anomalies and patterns, then drawing out actionable insights. AI’s advantage over human intelligence is that it can process huge volumes of data that a person or team could not feasibly analyze in a reasonable timeframe. By using historical patterns to predict future data quality outcomes, businesses can also dynamically detect anomalies that might otherwise have gone unnoticed or might only have been found much later through manual intervention.
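The anomaly-detection idea described above can be sketched in a few lines. This is only an illustration of the general technique, not Memoori's or any vendor's implementation; the rolling window, threshold, and the energy readings are all assumptions for the example.

```python
# Minimal sketch: flag anomalies in a sensor stream using a rolling
# z-score against recent history. Window size and threshold are
# illustrative assumptions, not values from the article.
from statistics import mean, stdev

def find_anomalies(readings, window=24, threshold=3.0):
    """Return (index, value) pairs that deviate sharply from recent history."""
    anomalies = []
    for i in range(window, len(readings)):
        history = readings[i - window:i]
        mu, sigma = mean(history), stdev(history)
        if sigma > 0 and abs(readings[i] - mu) / sigma > threshold:
            anomalies.append((i, readings[i]))
    return anomalies

# Two days of stable hourly kWh readings with one spike injected at hour 30.
usage = [50.0 + (i % 3) for i in range(48)]
usage[30] = 120.0
print(find_anomalies(usage))  # → [(30, 120.0)]
```

A person scanning 48 numbers would spot this spike too; the point of the quoted passage is that the same comparison against historical patterns scales to millions of readings where manual review is infeasible.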

This article from the UCLA Smart Grid Energy Research Center again draws upon the self-driving car comparison:

While there is rapid innovation in automotive technologies, the infrastructure to support the above vision of sustainable transportation - including automatic traffic light sensing by the vehicle, instantaneous traffic data, Vehicle-to-Infrastructure (V2I) communications, Vehicle-to-Vehicle (V2V) communications, managing and delivering electric power wirelessly to EVs (including shuttles and people movers) while in motion, managing traffic dynamically based on real-time and historical data, and detecting and avoiding pedestrians - is in its early stages at best. This makes it challenging for vehicle manufacturers to innovate towards fully autonomous and all-electric vehicles beyond a point. Innovations in the areas of communications, sensors, GPS, software, cloud computing, controls, energy storage, power management, battery technology, wireless EV charging during operation, cybersecurity, big data, AI, Machine Learning (ML), Data Science, and Blockchains are paving the way to create the smart urban transportation infrastructure that would enable this vision of a modern, sustainable transportation future.

This article includes coverage from the most recent Consumer Electronics Show and delivers an interesting message about the far reaches of Autonomous Interactions: that Google Home's Assistant could one day know your mood. Take that, Alexa:

In the next five years, Huffman suggests, the Assistant could achieve the basics of natural human conversation, which, from a computer science standpoint, are anything but basic. He says wake words like "Hey" or "OK" are "really weird." He wants the Assistant to understand your mood and tone, and detect if you're frustrated. He wants the software to remember an exact discussion you had with it yesterday so that today you can pick up where you left off. 

I ask him about the vision for 10 years from now. Maybe, he muses, physical robots -- not just bots you can talk to, but robots that move and do stuff -- will become household products, and digital assistants could integrate with them. 

Now that is the Building Emotion I am talking about!

Toby Ruckert, CEO of UIB, an IoT messaging company, makes the case that conversation will one day become the linchpin of user experience in this article, New Ways to Interface: Messaging as a Platform:

The State of Human-to-Machine Communications

Conversational user experiences, in the form of chatbots and voice interfaces, are overtaking many of the traditional ways in which we interact with machines. Since the rise of computers, human-machine interfaces have typically had some form of Graphical User Interface (GUI), which enabled direct (if limited) interaction with devices and their programs, for instance via software installs, mobile apps, and web-based applications such as Software as a Service (SaaS). No matter how “beautiful” the respective interface, the GUI is now more and more being replaced by a Conversational User Interface (CUI)…

These CUIs come in many shapes: chatbots on popular messaging apps such as WhatsApp, WeChat, Telegram, and iMessage, and, more recently, voice-activated devices and personal assistants such as Alexa, Siri, and the Google Assistant.
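As a toy illustration of what sits behind a CUI (deliberately much simpler than the NLU in UIB's platform or any commercial assistant), here is a keyword-based intent matcher for a building chatbot. The intents and phrases are hypothetical examples.

```python
# Minimal sketch of a conversational interface for a building:
# keyword-based intent matching. Real chatbots and voice assistants use
# trained language models; this toy version just counts keyword hits.
# All intents and keywords below are hypothetical examples.
INTENTS = {
    "set_temperature": ["temperature", "thermostat", "warmer", "cooler"],
    "lighting": ["light", "lights", "dim", "brighten"],
    "report_issue": ["broken", "leak", "not working", "repair"],
}

def classify(utterance):
    """Return the intent whose keywords best match the utterance."""
    text = utterance.lower()
    scores = {
        intent: sum(kw in text for kw in keywords)
        for intent, keywords in INTENTS.items()
    }
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else "unknown"

print(classify("Can you make it a bit warmer in here?"))  # → set_temperature
print(classify("Good morning!"))                          # → unknown
```

The gap between this sketch and "understanding your mood and tone," as described in the Google Assistant excerpt above, is exactly the distance the CUI journey still has to travel.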

The IEEE Global Initiative on Ethics of Autonomous and Intelligent Systems is launching the second version of Ethically Aligned Design: A Vision for Prioritizing Human Well-being with Autonomous and Intelligent Systems (EADv2), a crowd-sourced global treatise regarding the ethics of autonomous and intelligent systems. Their site is a very good resource for our journey to Socially Acceptable Autonomous Buildings.

In fact, the IEEE Standards Association has even developed a model process engineers and technologists can follow for addressing ethical concerns during system design:

The process spans the various stages of system initiation, analysis, and design. Expected process requirements include management and engineering views of new IT product development, computer ethics and IT system design, value-sensitive design, and stakeholder involvement in ethical IT system design.

Without smart buildings, a truly smart city can’t exist, says Dave Hollander of the Bluetooth SIG:

As self-contained structures, smart buildings address the need for automation, control and monitoring, while serving as the building blocks of a scalable foundation towards the larger smart city. Without smart buildings, a truly smart city can’t exist.

This article discusses “Smart Buildings as a Service” – sometimes called “servitisation”:

Data from these smart building systems give a facility’s infrastructure a brain and a voice. This data is put to work through smart controls for buildings – whether in the public sector or commercial – which give buildings a “central nervous system” that balances and reconciles competing interests such as energy minimisation, occupant comfort and grid stability. 
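The "central nervous system" balancing act described above can be sketched as a weighted-objective controller that picks a setpoint by trading off energy cost, occupant comfort, and grid stability. The weights, the comfort model, and the candidate setpoints below are illustrative assumptions, not the method of any product mentioned in the article.

```python
# Minimal sketch of balancing competing building objectives: score each
# candidate HVAC setpoint on energy cost, occupant discomfort, and load
# on a stressed grid, then pick the cheapest total. All weights, prices,
# and models are illustrative assumptions.
def score(setpoint_c, energy_price, grid_stress, weights=(1.0, 2.0, 2.0)):
    w_energy, w_comfort, w_grid = weights
    energy_cost = abs(setpoint_c - 16.0) * energy_price  # heating effort
    discomfort = abs(setpoint_c - 21.5)                  # distance from ideal
    grid_penalty = grid_stress * abs(setpoint_c - 16.0)  # load while stressed
    return w_energy * energy_cost + w_comfort * discomfort + w_grid * grid_penalty

def choose_setpoint(energy_price, grid_stress):
    candidates = [x / 2 for x in range(36, 49)]  # 18.0 .. 24.0 °C in 0.5° steps
    return min(candidates, key=lambda s: score(s, energy_price, grid_stress))

# Cheap power, calm grid: comfort dominates and the ideal 21.5 °C wins.
print(choose_setpoint(energy_price=0.1, grid_stress=0.0))  # → 21.5
# Expensive power under grid stress: the controller backs off to 18.0 °C.
print(choose_setpoint(energy_price=0.5, grid_stress=1.0))  # → 18.0
```

Real building controllers use far richer models, but the design choice is the same: competing interests become terms in one objective, and "balancing" them means tuning the weights.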

In this article, Is 2019 the Year of Truth for the Built Environment?, Marc Petock, Chief Marketing and Communications Officer for Lynxspring, Inc., seems to believe we have reached a tipping point:

2019 is the year we get smarter about smart technology, smart equipment and smart solutions!!

The built environment has been changing dramatically over the past five years. Technology shifts, along with changing value propositions, are making an impact on how we operate and manage our smart buildings. Our dialogue is no longer about the potential—rather, it is about the reality of economics through driving operational value and business outcomes. The business of smart, connected commercial buildings and facilities has moved beyond the unknown and into the new era of real and relevant. 2019 will surely be another transformative year, not only in terms of technology but also in a fundamental shift in the conversation driven by digital transformation, connected data, and new mindsets.

In closing this chapter, please read this history lesson, but exchange the phrase "web-convergence" for the next level, "Automated Intelligence and Autonomous Interactions." There are good cues for the future from out of our past. Looking back on (while looking forward from) 2002, our industry created a stable of "Awful Mated Buildings," and I fear this could occur again:

The large building automation industry understood the concept of web convergence early, but I do not believe that we envisioned ourselves as having such an active part in it. Originally we perceived that we would take our products and services to a web level and, magically, our services would become part of the client's web-based enterprise. It is now becoming clear that the companies most likely to succeed are developing the "magic" to take our automation interfaces to the next level in an elegant convergence model that adds tremendous value to our clients' enterprises.

My Call to Action - Our industry has a limited window of opportunity to control the successful convergence to our web-based clients. We have the knowledge to make these transitions work better than anyone. We know that we must significantly increase our knowledge of web-based ways and add "Web Heads" to our corporate structures. We can hire or create from within, but our focus for the next few years must shift from hardware integration to successful web integration and convergence. It isn't that the evolving hardware integration standards aren't important; it is that if we do not complete our connection to the web world, we run the risk of losing that large market in the future. As the cost of hardware drops and the engineering of DDC systems becomes automated, our industry will shrink if we do not move quickly to the next level. There is much work to be done at this level, and this is likely the closest we will ever get to the ultimate interface with the end users of our building automation systems. The importance of this, and the speed at which we move, is cardinal. Our industry is about providing client comfort services. Our clients' information flow model is becoming web-based, and the transition to web-based solutions is not only desirable, it is becoming mandatory.

For our industry, AI, no matter how we define it, is the next big thing. We need to integrate it into our services and products. We need to envision ourselves as the intelligence in AI.
