
Artificial Intelligence and the Future of Behavioral Support Programming



Behavior support has come a long way, but it is about to get exponentially better thanks to Artificial Intelligence (AI). By drawing on data sources that mere humans simply cannot identify, and on repetitive learning processes that outlast our attention spans, AI should be able to identify a problem behavior, and a solution to that behavior, well before any person could.

One form of AI, so-called “adaptive learning,” is already familiar to educators in the form of self-paced learning software. However, two other forms of AI should have a significant impact on behavioral support.

With the “unsupervised learning” technique, a computer is provided many examples but is not instructed what to do with them. Instead, by searching out common features, the computer discovers patterns or links in the data. This type of AI learning is particularly useful when looking for something without knowing what it looks like. The Economist’s Special Report on Artificial Intelligence (June 25, 2016) offers the example of searching network traffic for anomalies that might correspond to a cyber-attack. When unsupervised learning is applied to behavioral support data, AI may quickly discern patterns related to various antecedent conditions, including behaviors associated with time of day, location, the presence or absence of peers and staff, and other ecological variables. AI may also be applied to school-wide or district-wide behavioral databases to help identify factors and conditions relevant to school climate and prevention programming.
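
As a rough illustration of how that kind of pattern discovery might work, consider the minimal sketch below. The incident fields, the sample data, and the choice of k-means clustering are all invented for illustration; a real behavioral database, and a real analysis, would be far richer.

```python
# Hypothetical sketch of unsupervised pattern discovery in behavioral incident logs.
# The field names and sample data are invented for illustration only.
import numpy as np
from sklearn.cluster import KMeans

# Each row is one incident: (hour of day, location, peers present?, staff present?)
incidents = [
    (8,  "hallway",   1, 0),
    (12, "cafeteria", 1, 0),
    (9,  "classroom", 1, 1),
    (12, "cafeteria", 1, 0),
    (14, "classroom", 0, 1),
    (8,  "hallway",   1, 0),
]

# One-hot encode the location and scale the hour so the features are comparable.
locations = sorted({i[1] for i in incidents})
X = np.array([
    [i[0] / 24.0] + [1.0 if i[1] == loc else 0.0 for loc in locations] + [i[2], i[3]]
    for i in incidents
])

# Ask for clusters without telling the algorithm what to look for; incidents that
# share antecedent conditions (time, place, who was present) tend to group together.
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
for incident, label in zip(incidents, labels):
    print(label, incident)
```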

Another AI technique is “deep learning.” With this technique, the computer learns through repeated trials and examples, allowing it to “figure it out” on its own rather than following an explicit program. Such repetitive learning is the basis, for example, of voice recognition, familiar to anyone who has had to train voice dictation software to work for a student. According to the Special Report, other examples of deep learning include scrutinizing repeated examples and patterns to detect credit card fraud and to “read” x-rays and CT scans more accurately and quickly than humans do. It seems possible that deep learning could be applied to video data to determine the parameters of a teacher’s proximity control in a classroom, or the relationship between student movement in school hallways and behavioral conflicts.
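
To make “figuring it out” from repeated examples a little more concrete, here is a toy sketch. The features, labels, and data are invented, and a real deep learning system would be far larger and trained on far more examples; the point is only that a small neural network adjusts itself over many passes through labeled examples until its predictions fit them, without ever being handed an explicit rule.

```python
# Toy sketch of learning from repeated examples; not a production system.
# Features are invented: [ambient noise level, crowding, minutes since last break],
# with a label of 1 if an incident followed those conditions and 0 otherwise.
from sklearn.neural_network import MLPClassifier

X = [
    [0.9, 0.8, 50], [0.2, 0.1,  5], [0.8, 0.7, 45],
    [0.1, 0.3, 10], [0.7, 0.9, 60], [0.3, 0.2, 15],
]
y = [1, 0, 1, 0, 1, 0]

# No explicit rule is programmed; the network repeatedly revisits the examples
# (up to max_iter passes) and adjusts its weights until its predictions match them.
model = MLPClassifier(hidden_layer_sizes=(8,), max_iter=5000, random_state=0)
model.fit(X, y)

# Conditions resembling past incidents should now be flagged.
print(model.predict([[0.85, 0.75, 55]]))
```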

Discussing investor fascination with AI, the Special Report quotes Richard Socher, founder of an AI company called MetaMind, as saying that AI “will be applied in pretty much every industry out there that has any kind of data – anything from genes to images to language.” One thing we know about modern schools, and particularly the behavioral support component of a child’s educational program, is that we have lots and lots of data. Much of that data, however, is not well analyzed, and even more goes uncollected. Here, too, AI has potential for mining behavioral data to monitor and improve individual and school-wide positive behavior support.

In 2017, perhaps the best-known examples of wearable technology are smartwatches. Continuing advances in wearable technology will extend to articles of clothing, and such advances have the potential to benefit student behavioral support programs. Although presently not focused on behavioral support programming, “[s]portswear companies are competing to develop jerseys, shoes, and undergarments loaded with sensors and wireless circuitry. Firms’ ambitions range from the critical to the cuddly. Several companies are offering small gadgets that use GPS technology to track children who might wander off, for example. CuteCircuit, a British startup, has designed a smart shirt that reproduces the feeling of being hugged when someone sends the wearer a text message.” (http://www.economist.com/news/business/21646225-smartwatches-and-other-wearable-devices-become-mainstream-products-will-take-more). These efforts involve fabrics with increasing levels of comfort (http://www.economist.com/news/technology-quarterly/21598328-conductive-fibres-lighter-aircraft-electric-knickers-flexible-filaments), a critical development because students needing behavioral support often have unique sensory preferences, including preferences about how fabric feels.

Such advances make foreseeable a future in which AI can use data gathered from high-tech fabrics and other sensors to assist behavioral intervention planning. It would seem possible not only to gather frequency, duration, and intensity data (and without subjectivity), but also data not currently available: body temperature, heart rate, oxygen saturation, body movement (both visually obvious, such as running, and less obvious, such as twitching and even eye movement); and ambient environmental information, such as temperature (indeed, any climate factor), light intensity (lumens), sound density or saturation (the amount of noise), sound volume (decibels), and proximity to others (infrared). Through deep learning and other predictive technologies, AI should be able not only to correlate behavioral cause and effect objectively, but ultimately to predict a behavioral outcome in time to recommend a course of action that addresses the behavioral precursors.
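
A back-of-the-envelope sketch of how such streams might feed a predictive model follows; every sensor name, reading, and outcome label below is hypothetical. The idea is simply to summarize recent sensor readings into a window of features and score how likely an escalation is to follow.

```python
# Hypothetical sketch: estimating escalation risk from windowed sensor features.
# All sensor readings and outcome labels are invented for illustration.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Each row summarizes one five-minute window:
# [mean heart rate, movement variance, ambient decibels, peers within two meters]
windows = np.array([
    [ 78, 0.2, 55, 1],
    [112, 0.9, 82, 4],
    [ 80, 0.3, 60, 2],
    [118, 1.1, 85, 5],
    [ 76, 0.1, 50, 0],
    [109, 0.8, 79, 3],
], dtype=float)
escalated = np.array([0, 1, 0, 1, 0, 1])  # did an incident follow the window?

model = LogisticRegression().fit(windows, escalated)

# Score a new window; a high probability could prompt a proactive support step
# (a break, a quieter space) before the behavior occurs.
new_window = np.array([[115, 1.0, 83, 4]], dtype=float)
print(model.predict_proba(new_window)[0, 1])
```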

We may know a child is set off by noise, but we struggle to know much more than that. AI could discern almost any noise-based factor, such as frequency, pitch, and tone, in addition to “it’s too loud.” As good as our human behaviorists are, these factors seem beyond our human capabilities but well within those of our AI-based tools.
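
As a hint of what “more than just loud” could look like, here is a minimal sketch. The audio clip is synthetic, and analyzing real classroom recordings would raise technical and privacy questions well beyond this example; it only shows that a dominant frequency and a crude loudness estimate can be pulled out of a sound sample.

```python
# Minimal sketch: extracting simple noise features from an audio clip.
# The "clip" here is synthetic; a real system would read microphone data.
import numpy as np

sample_rate = 16000                        # samples per second
t = np.arange(0, 1.0, 1.0 / sample_rate)   # one second of audio
clip = 0.6 * np.sin(2 * np.pi * 440 * t)   # a 440 Hz tone standing in for classroom noise

# Dominant frequency via FFT (roughly "pitch" for a simple tone).
spectrum = np.abs(np.fft.rfft(clip))
freqs = np.fft.rfftfreq(len(clip), d=1.0 / sample_rate)
dominant_hz = freqs[np.argmax(spectrum)]

# Root-mean-square amplitude as a crude loudness measure.
rms = np.sqrt(np.mean(clip ** 2))

print(f"dominant frequency: {dominant_hz:.0f} Hz, RMS loudness: {rms:.2f}")
```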

Now all we need is a grant and a few computer science researchers . . . .

Clients who have questions regarding issues discussed in this article, or any education law matter, should feel free to call us at 215-345-9111.