“How do we find out as much as we can about an occupied building that we’re about to refurbish without disrupting our tenants? We don’t want to annoy our customers before they vacate, with extensive opening up or verification works within walls, floors, risers and ceilings. But then when you start on site you’re going into the unknown and you only begin to uncover problems as you start work under contract. Is there a clever way to mitigate that discovery risk while maintaining an efficient programme?”
Alan Bunting / project director / British Land
Archaeology in the office
Shane Orme / graduate mechanical engineer / WSP / UK
“The drone would be able to scan key plant areas, rather than someone having to go into the ceiling void and look around”
You could get a pretty good view of both a space and the building systems by using drones and 3D imaging. The drone could carry out a non-intrusive survey in the evening when the building was empty.
With a laser scanning device, a drone could produce a point-cloud survey — a 3D map that would get more accurate the more time it spent flying around. Ground-penetrating radar (GPR) is normally used for finding things under the ground, like pipework, or for archaeological surveys, but I don’t see why you couldn’t do the same thing for walls and ceiling voids. It works on reflections, so the biggest signal you’d get would be from the wall itself — but you could filter that out to look at what was behind it. The drone would be able to scan the locations of key plant areas — for example, fan coil units in the ceiling void — to check that they’re all in the right place, rather than someone having to actually go up into the ceiling void and look around.
The biggest challenge would be to relate that to the out-of-date information you’ve already got — so mapping it onto the as-built BIM model. That’s okay for the point-cloud survey, but the interface between GPR and BIM would have to be developed. The GPR survey would just show a collection of points or a line, and you’d have no idea whether it was a duct or a pipe or a wire or a beam. So you’d have to overlay the BIM model and spot the difference. It would be very complicated, but you could potentially train a computer to compare the GPR survey with the existing BIM. You could use AI or neural networks — or a human could probably do it.
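The comparison step described here — overlaying the survey on the as-built BIM model and spotting the difference — could start much simpler than a neural network. A minimal sketch in Python, with invented asset names and an assumed matching tolerance: for each surveyed reflection, find the nearest asset in the model and flag anything with no plausible match.

```python
import math

def compare_survey_to_bim(survey_points, bim_assets, tolerance=0.15):
    """Flag surveyed reflections with no nearby asset in the BIM model.

    survey_points: list of (x, y, z) positions from the GPR/point-cloud survey.
    bim_assets: dict of asset name -> (x, y, z) position in the as-built model.
    tolerance: maximum distance (metres) for a reflection to count as matched.
    """
    unknown = []
    for p in survey_points:
        nearest = min(bim_assets, key=lambda a: math.dist(p, bim_assets[a]))
        if math.dist(p, bim_assets[nearest]) > tolerance:
            # Something behind the wall that the model doesn't know about
            unknown.append(p)
    return unknown

# Hypothetical as-built positions and two surveyed reflections
bim = {"fcu_01": (2.0, 5.0, 2.7), "duct_a": (4.0, 5.0, 2.8)}
survey = [(2.05, 5.0, 2.7), (9.0, 1.0, 2.6)]
print(compare_survey_to_bim(survey, bim))  # → [(9.0, 1.0, 2.6)]
```

A nearest-neighbour check like this would only classify points as "expected" or "unexplained"; identifying whether an unexplained reflection is a duct, pipe, wire or beam is where the machine learning (or the human) would come in.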
A service robot with a hidden agenda
Rachel Kennedy / smart buildings specialist / WSP / US
What if there was a robot that delivered coffee in the office? It could learn about occupants’ patterns and the conditions of the spaces, and non-invasively inform the design and the improvements that need to be made when the tenants leave.
“Just by tracking coffee delivery patterns, we would know the paths that people usually take and the high-traffic areas”
I work in the smart buildings group, and a lot of the time we focus on how comfortable a building is for the occupants. In the BOLD&R Lab, we have a mesh network of multi-sensors across our entire office that measure temperature, light levels and sound, and track people’s locations — but they require installation, which takes time and can be invasive. Instead, we could install the sensors on a robot which could measure conditions across the space with the same granularity as it moves around. If we find there’s low indoor air quality, that could indicate that pollutants have been used or that there might be mould within the walls. An airflow sensor could tell us if there’s even air distribution throughout the space or not, and we could produce heat maps which could show any areas that are particularly hot or cold to flag potential problems with the building systems. Just by tracking coffee delivery patterns, we would know the paths that people usually take and the high-traffic areas — where someone would usually have been walking to the coffee pot — which could indicate wear and tear that needs to be fixed. It would also show where the robot never goes, indicating that perhaps the space is uncomfortable — there might be too much glare — or that it could be used in a more valuable way. All this information could help inform other conversations with tenants.
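The heat maps mentioned above could be produced by binning the robot’s roving readings into grid cells and comparing each cell with the floor-wide average — a rough sketch, with an assumed cell size and temperature band:

```python
from collections import defaultdict

def heat_map(readings, cell=2.0):
    """Average roving temperature samples into grid cells (cell size in metres)."""
    cells = defaultdict(list)
    for x, y, t in readings:
        cells[(int(x // cell), int(y // cell))].append(t)
    return {c: sum(ts) / len(ts) for c, ts in cells.items()}

def flag_outliers(grid, band=2.0):
    """Return cells whose mean temperature strays from the floor average by more than `band` degrees."""
    avg = sum(grid.values()) / len(grid)
    return {c: t for c, t in grid.items() if abs(t - avg) > band}

# A morning of (x, y, temperature_c) samples logged as the robot moves around
readings = [(0.5, 0.5, 21.0), (1.0, 1.5, 21.4), (3.0, 0.5, 21.8), (5.0, 5.0, 26.0)]
print(flag_outliers(heat_map(readings)))  # → {(2, 2): 26.0}
```

The same binning would work for light, sound or air-quality readings; the flagged cells are the areas worth raising with the building systems team.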
A coffee robot should be feasible with the technology that we have. There are already plenty of service robots, though it would need a station where it could fill up with coffee, or the ability to use a coffee machine. An app would probably be the easiest way for people to call the robot — you could just order your coffee as you would on the Starbucks app and drop a pin to tell it your location, or it could find you using Bluetooth sensors. It’s important that the robot is perceived to be doing something useful, because a lot of the time people don’t want to use technology, or they feel like they’re being spied on, until it becomes an asset or adds value to their lifestyle. If it’s delivering something that people in offices tend to want, it’s no longer seen as an inconvenience, while still capturing the information that we need.
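The dispatch side could stay equally simple: with each order pinned to an indoor position (via the app or Bluetooth), the robot just serves the nearest pending request first. A toy sketch, with invented names and coordinates:

```python
import math

def next_delivery(robot_pos, requests):
    """Pick the pending coffee request closest to the robot's current position.

    robot_pos: (x, y) from the robot's indoor positioning.
    requests: dict of person -> (x, y) pin dropped in the (hypothetical) app.
    """
    return min(requests, key=lambda person: math.dist(robot_pos, requests[person]))

print(next_delivery((0.0, 0.0), {"ana": (3.0, 4.0), "raj": (1.0, 1.0)}))  # → raj
```

Every trip the scheduler plans is, as a side effect, a sensor traverse of the office — which is the point of the hidden agenda.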
The 3D scanner in your pocket
Roneel Singh / technology lead (ANZ) and smart building specialist / WSP / Australia
What if the building owner provided all the tenants in the building with a new and improved smart card, for building entry and access and to improve their experience within the building? The smart card could be embedded with the ability to scan small areas or short distances similar to the sensors associated with autonomous vehicles. Each tenant then becomes a mini drone for the building owner; as they walk around the building they scan the surrounding areas to provide data and analytics about the building. The more sensors you embed into the card, the more data you receive.
“With a validated 3D model, we could move through the digital space using virtual reality without impacting the occupants”
For the services that cannot be seen, like cabling and air conditioning in the walls and ceiling, the cleaner’s vacuum could be fitted with a module containing a range of sensors to “x-ray” the building as the cleaner does their job. The “x-ray” would continuously build a 3D model of the hidden services showing their location within the building. Algorithms will categorize and catalogue them according to the systems that the components serve. The sensors will be sensitive enough to determine material characteristics such as thickness and composition. Hazardous materials will be automatically flagged and reported.
All of the data can be uploaded into a central data lake and then cross-referenced with multiple sources to provide a validated data set for the building. Simple algorithms could be set up to validate the quality of the carpet or flooring based on acoustics or vibration data, the quality of lighting and glazing based on lighting levels, or the efficiency of HVAC systems based on noise, vibration and temperature set-point fluctuations. Everything will be automatically tagged to build a central asset and condition register.
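The “simple algorithms” described above could literally be threshold rules run over the cross-referenced data set, each producing a flag for the condition register. A sketch, where the metric names and thresholds are illustrative assumptions rather than a real schema:

```python
def condition_flags(asset):
    """Apply threshold rules to cross-referenced sensor metrics for one asset.

    asset: dict of sensor-derived metrics for the asset; the keys and
    thresholds below are invented for illustration.
    """
    rules = [
        ("flooring_worn", lambda a: a.get("footstep_vibration_mm_s", 0) > 1.5),
        ("glazing_glare", lambda a: a.get("lux", 0) > 2000),
        ("hvac_unstable", lambda a: a.get("setpoint_drift_c", 0) > 1.0),
        ("hvac_noisy",    lambda a: a.get("noise_dba", 0) > 55),
    ]
    return [name for name, check in rules if check(asset)]

# A fan coil unit that runs loud but holds its set point
fcu = {"noise_dba": 61, "setpoint_drift_c": 0.4}
print(condition_flags(fcu))  # → ['hvac_noisy']
```

Rules like these would be a starting point; the value comes from running them continuously over the data lake so the asset and condition register stays current without a manual survey.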
With the 3D model built, analyzed and populated with the tagged assets, we could move through the digital space using virtual reality without impacting the existing occupants. Prospective contractors will be able to carry out virtual site inspections and assess all the services without the need to go on site. Safety will be improved through this process. Ladders won’t be required to access ceilings and exposure to hazardous materials can be avoided. Planning and assessment of works can be discussed collaboratively, with the risks clearly identified and no hidden surprises lurking in the ceiling.
Combining advanced sensors, automated analysis and virtual reality in this way will enable the assessment and management of the physical environment without any need to disrupt the current occupants.