How AI Could Have Warned Us about the Florida Condo Collapse Before It Happened

A human can’t sit and stare at a wall all day and night. A computer can.

Ian Kelk
Towards Data Science


The partial collapse of a 12-story beachfront condominium in Surfside, Florida was a shock to the nation. With 98 people confirmed dead, it is one of the deadliest structural collapses in American history. The root cause of the structural failure has yet to be determined, but from initial reports it does seem that this tragedy could have been prevented.

Left: the aftermath of the Surfside condominium building collapse of June 24, 2021. Right: detection of “concrete cancer” by an AI detector. Left photo by the Miami-Dade Fire Rescue Department, public domain. Right photo by Achim Hering on Wikimedia, under a Creative Commons Attribution license.

Imagine if the security cameras that already exist in many high-rise parking garages could also report damage in the concrete of the building itself

After working for the past few weeks as an Upwork contractor for an AI startup called Clarifai, I realized I could use their platform to create such a thing. Camera systems already film public areas 24 hours a day, including concrete parking garages. The video below shows a full demonstration, including a walkthrough of the garage of Champlain Towers South a year before the disaster.

An important thing to note: while the demo below uses video taken by a person while walking (it’s the only footage we have of the Champlain Towers garage prior to the collapse), the AI is designed to work with stationary cameras mounted on the walls. It would expand the role of security cameras: not only recording the actions of people in their field of view, but also reporting on the condition of the walls themselves.

Video essay explaining the backstory, development, and demo of the model discussed in this post. Model demo begins at the 2:43 mark.

A little bit of background

Living in a tropical place like Miami Beach has certain complications, specifically flooding during the summer months. The photo and video below show some of the more extreme examples, but this happens three or four times a season. It’s all too common to see naive drivers attempt to cross flooded roads, only for water to get sucked into their intakes and destroy their engine blocks. Heavy rainfall is usually accompanied by a few stranded vehicles, and occasionally hurricanes push even more water onto the area through storm surge.

A photo I took of flooding from a summer rainfall in July 2018. This was taken on the 6900 block of Collins Ave, 18 blocks south of the 8700 block where Champlain Towers South was situated. Photo by the author.

While the cause of the Champlain Towers disaster has yet to be determined, it’s easy to imagine how such flooding, combined with the salty sea air, would create a unique set of challenges. The problem looks to be worsening as well, so much so that the U.S. Army Corps of Engineers proposed a 20-foot seawall to protect the City of Miami. It will come as little surprise if, in the years to come, more Florida buildings are found unsafe and evacuated as a precaution.

Flooded roads during July in Miami Beach. These minor floods were surprisingly frequent. I took this video the day I had to walk to my local dentist for a pretty wet appointment. Video by the author.

The perspective from my rooftop

A photo I took from the roof of my building in Miami Beach, about 2 miles from the Champlain Towers, with the City of Miami in the background. Dramatic skies and sunsets are typical for the area. Photo by the author.

The Surfside catastrophe had a personal effect on me. For 3 years, I lived in a similar building in Miami Beach, just 2 miles south of the disaster site. My home was situated on the same stretch of beautiful beachfront property running along Collins Ave, all the way from Sunny Isles down to South Point at the tip of Miami Beach. I’ve walked by the Champlain Towers many times.

A photo taken off my balcony of two giant cruise ships sitting moored with skeleton crews during the COVID-19 pandemic on April 1, 2020. Photo by the author.

Even as the rescue operations shut down and switched into recovery mode, the effects of the collapse were already being felt across the community. People who live in high-rises are scared for their safety. Homeowners are scared for their property values. In an area already shaken by the COVID-19 pandemic, everything feels uncertain.

Security cameras are a common sight in all areas of South Florida high-rises. Photo by What Is Picture Perfect on Unsplash

One thing I kept thinking about from my time living in Miami Beach is that every high-rise in South Florida has cameras everywhere — even in the parking garages.

This means that, with artificial intelligence, we can create an early warning system for structural problems in buildings using existing security cameras.

AI is already saving lives in medicine

When you think of “artificial intelligence”, any number of things could come to mind. You might think of computers defeating chess grandmasters, dancing robots from Boston Dynamics, or even a computer that is capable of being interviewed. However, there are lots of subsets of AI, and some aren’t that hard to conceptualize.

One part of AI is known as machine learning, and it works somewhat similarly to how humans learn. If I told you I wanted you to learn how to identify lymphoma by looking at CT scans, how would you go about it? You’d likely ask to be provided with several examples of CT scans with cancer, and several examples of CT scans without any cancer. Given even just a handful of these examples, you might begin spotting tumors in medical scans. This is where humans have the advantage — with just a few examples, we’re able to spot visual cues that lead to conclusions.

Esophageal cancer, CT scan with contrast, coronal image. Photo by Tdvorak on Wikimedia, reposted under a Creative Commons Attribution-Share Alike license.

Machine learning works similarly, but with a lot more pictures to learn from. Instead of just a few images, you tell the computer, “here are a few thousand scans of cancer, and here are a few thousand scans of patients with no cancer.” The computer then chugs away and produces a “model”, which can look for any number of differences between cancerous and non-cancerous examples.

The data used to train the computer is conveniently called the “training data,” and the more of it that is available, the more the model can exhibit surprising, unexpected behavior, detecting things that humans cannot. The most important point here is that this technology is already in wide use for medical scans, and can even improve their quality.
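To make the idea concrete, here’s a minimal sketch of what training a two-class image model can look like in code. Note that this is just an illustration using PyTorch and torchvision, not the setup I actually used (my model was trained on Clarifai’s platform), and the folder layout and hyperparameters are placeholders:

```python
# Illustrative sketch: fine-tune a pretrained network to separate two
# classes of images. Assumes images are sorted into two subfolders,
# data/damaged and data/undamaged (placeholder paths).
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

transform = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])

# ImageFolder infers the two class labels from the subfolder names.
dataset = datasets.ImageFolder("data", transform=transform)
loader = DataLoader(dataset, batch_size=16, shuffle=True)

# Start from a network pretrained on ImageNet and replace its final
# layer with a two-class head ("damaged" vs. "undamaged").
model = models.resnet18(pretrained=True)
model.fc = nn.Linear(model.fc.in_features, 2)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()

model.train()
for epoch in range(5):
    for images, labels in loader:
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
```

Everything the network learns about the two classes comes from the labeled examples it is shown, which is why the quantity and quality of training data matter so much.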

We can use a similar method to detect problems in concrete foundations

The AI model I trained detecting damage in the ceiling of the Champlain Towers South garage. Photo by author, original video footage of Champlain Tower South: visiting unit 611 on July 17, 2020 — Surfside FL condo collapse by Dr. Fiorella Terenzi

Think again about how a human does this. How does someone notice if something is wrong with the concrete foundations of a building? Visual cues are often the first indication. Are there rust stains? Are there cracks? Are there pieces falling off or any noticeable deterioration? Failing concrete yields clues that experienced engineers notice — experience gained over years of examining both failing and structurally sound buildings. John Pistorino, the man behind requiring 40-year recertifications for buildings, said in a recent interview that “concrete gives you a warning. It gives you a warning. It doesn’t fail that fast.”

A human can’t sit and stare at a wall all day and night. A computer can.

An example of serious “concrete cancer” where the inner rebar is exposed and rusted. The appearance of damage like this could imply immediate danger to the structure of a building. Photo by gratuit on freeimageslive.co.uk licensed under a Creative Commons Attribution 3.0 Unported License.

Years of noticing visual cues is something that can be approximated by training an AI with thousands of images. Interestingly enough, the rusting of the steel rebar contained within a concrete slab is commonly known as “concrete cancer.” As the steel corrodes, it expands and causes the surrounding concrete to become brittle and crack, which accelerates the deterioration. Telltale signs of concrete cancer include “spalling” — cracking and flaking away of small pieces of concrete — as well as rust stains that appear to seep out from the inside, and bubbling and leaks overhead. If the interior steel rebar is exposed, the problem has become extremely serious.

Terms like “concrete cancer” and “concrete spalling” can be very powerful when combined with image searches to assemble large training sets for an AI model. Below, I used Clarifai’s online portal to create a model using images I found through the search term “concrete cancer.”

Training the model using images of damaged concrete. Photo by author.

Remember what I mentioned earlier about Miami high-rises having cameras everywhere? Those cameras feed into a closed-circuit monitoring system displayed at security desks. The same feeds could also go into an AI system that constantly checks for signs of structural damage, every moment of every day and night. The system isn’t magical; if you can’t see a problem, then the computer likely can’t either. The difference is that the computer can watch the walls for damage 24/7 without ever taking its eyes off them — something that is impossible for a human.
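As a sketch of how such a system could work, the loop below samples a frame from a fixed camera once a minute and sends it to a trained Clarifai model for classification. This is illustrative, not a production design: the API key, model id, and concept name are placeholders, and depending on the client version the request may also need app and user identifiers.

```python
# Hypothetical monitoring loop: sample one frame a minute from a fixed
# garage camera and ask a trained Clarifai model whether it sees damage.
import time
import cv2
from clarifai_grpc.channel.clarifai_channel import ClarifaiChannel
from clarifai_grpc.grpc.api import resources_pb2, service_pb2, service_pb2_grpc
from clarifai_grpc.grpc.api.status import status_code_pb2

stub = service_pb2_grpc.V2Stub(ClarifaiChannel.get_grpc_channel())
metadata = (("authorization", "Key YOUR_API_KEY"),)  # placeholder key
capture = cv2.VideoCapture(0)  # the garage camera feed

while True:
    ok, frame = capture.read()
    if not ok:
        break
    _, jpeg = cv2.imencode(".jpg", frame)  # encode the frame as a JPEG

    response = stub.PostModelOutputs(
        service_pb2.PostModelOutputsRequest(
            model_id="concrete-damage",  # hypothetical custom model id
            inputs=[resources_pb2.Input(data=resources_pb2.Data(
                image=resources_pb2.Image(base64=jpeg.tobytes())))],
        ),
        metadata=metadata,
    )
    if response.status.code == status_code_pb2.SUCCESS:
        for concept in response.outputs[0].data.concepts:
            if concept.name == "damaged" and concept.value > 0.9:
                print(f"ALERT: possible damage (confidence {concept.value:.2f})")

    time.sleep(60)  # one frame a minute is plenty for a wall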

The other advantage of using existing cameras is that residents are both comfortable with their existence and embrace the security they offer. Having the cameras also keep an eye on the resilience of the structure itself brings that much more confidence to a nervous public.

The AI model I trained detecting damage underneath a balcony of the Champlain Towers South, as viewed from Google Street View. As of the writing of this article, this is still visible using this link. Video by author using Google Street View.

Another way an AI could find problems in concrete foundations — change detection

An AI model monitoring for structural damage can also play “spot the differences”, a familiar game for children.

Take a look at the busy pair of photos below. There are 8 differences between them; a computer would spot all of them in a fraction of a second. How long does it take you?

Original photo by Christopher Burns on Unsplash. Edited by author to remove certain objects from right side.

Comparing two images and detecting the differences is something computers excel at; with a stationary camera they notice changes much faster than humans do. Given the plethora of cameras already installed in South Florida buildings, a system could monitor these feeds and detect any change in the building itself. The main challenge in such a system is distinguishing the concrete structure of the building from the cars and people passing by, but that is well within the reach of modern computer vision.

This idea — change detection — is an important concept for an AI model. Even if a concrete structure doesn’t display any signs of damage, it shouldn’t move. The solution is straightforward — a camera watches a wall, and if the wall changes in any way, people should be alerted.
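Here’s a minimal sketch of that idea using OpenCV: compare every new frame against a reference image of the wall, and flag the frame if too many pixels have changed. The video source and thresholds are placeholders, and a real system would also need to mask out passing cars and people, for example with background subtraction or object detection:

```python
# Illustrative change detection for a fixed camera using OpenCV.
import cv2

capture = cv2.VideoCapture("garage_camera.mp4")  # placeholder video source

ok, reference = capture.read()  # first frame is the "known good" wall
reference_gray = cv2.cvtColor(reference, cv2.COLOR_BGR2GRAY)

while True:
    ok, frame = capture.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

    # Pixel-wise difference against the reference, thresholded to
    # ignore sensor noise and small lighting changes.
    diff = cv2.absdiff(reference_gray, gray)
    _, mask = cv2.threshold(diff, 40, 255, cv2.THRESH_BINARY)

    # If enough of the frame has changed, flag it for human review.
    changed_fraction = cv2.countNonZero(mask) / mask.size
    if changed_fraction > 0.01:
        print(f"Change detected in {changed_fraction:.1%} of the frame")
```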

One thing I knew from living in Miami Beach, is that every high-rise in South Florida already has cameras everywhere — even in the parking garages.

Building the model

I’ve been able to train a basic model to detect damaged concrete in very little time. A number of Clarifai’s engineers have been extremely supportive of this project, and worked with me to create the working prototype. I put together a training data set of 300 photos of damaged concrete and 173 photos of undamaged concrete. This is a rather small data set for a project of this scope, but it was sufficient to build a fairly accurate prototype.

The need for quality training data is frequently a blocking issue in AI. If you’ve ever used facial recognition on Facebook or Google, you’ll probably have noticed that it becomes more accurate over time as you let it know when it has correctly tagged your face and when it hasn’t.

Viewing the training data. Video by author.

The same is true for images of “concrete cancer” and “concrete spalling.” By far the most time-consuming part of creating the prototype detector model seen in the video was collecting the training data. The more images I can get of both damaged and undamaged concrete, the better the model will become. If anyone has access to large numbers of photos of damaged concrete and is willing to share, let me know! The actual training of the model was accomplished overnight while I slept.
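For anyone curious what adding labeled examples looks like programmatically, here’s a hypothetical sketch using Clarifai’s Python gRPC client. I did this through the web portal instead, so treat this as a sketch: the API key, image URL, and concept id are placeholders.

```python
# Hypothetical upload of one labeled training image to Clarifai.
from clarifai_grpc.channel.clarifai_channel import ClarifaiChannel
from clarifai_grpc.grpc.api import resources_pb2, service_pb2, service_pb2_grpc

stub = service_pb2_grpc.V2Stub(ClarifaiChannel.get_grpc_channel())
metadata = (("authorization", "Key YOUR_API_KEY"),)  # placeholder key

request = service_pb2.PostInputsRequest(
    inputs=[
        resources_pb2.Input(
            data=resources_pb2.Data(
                image=resources_pb2.Image(url="https://example.com/spalling.jpg"),
                # Each image is labeled with the concept the model should learn.
                concepts=[resources_pb2.Concept(id="damaged", value=1)],
            )
        )
    ]
)
response = stub.PostInputs(request, metadata=metadata)
print(response.status.description)
```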

My prototype model works well on the footage of the Champlain Towers garage. All that’s left is to try it out on a live camera feed and test a system that could save lives in the future.

The AI model prototype in action recognizing damaged concrete. Construction video source: Beaches Construction Co, detection video by author.

For more information, follow me here on Medium or email me.
