The role of research in the commercial real estate arena is not only to access and collect data, but also to synthesize that data into a presentable, digestible analysis for an eclectic group of audiences. Over the last several years, research’s importance within the industry has grown. In a world where data has become ubiquitous and almost every job in the industry includes a research component, knowing how to properly examine and interpret data is a significant advantage. But, as with everything in the real estate space, the way we research is changing under the influence of new technological tools. As a long-time research professional and current Head of Research at NAI Hunneman in Boston, I’d like to take this opportunity to discuss just how much the profession has transformed over the course of my career.
Let’s kick off this discussion with a brief look at the research landscape when I started out in early 2006. As a real estate economist with PPR (now CoStar Market Analytics), I was tasked with researching, analyzing and forecasting market fundamentals for several major metros in the southeastern United States. At the time, Microsoft Excel and Access were our go-to tools, and the quantitative team used statistical software for econometric modeling. Major sources of information and data included local and national news outlets, government resources, commercial real estate brokerage reports, and subscription services like Real Capital Analytics, Moody’s and Reed Construction Data. Needless to say, we were always hungry for more data.
Fast forward to today, and the industry is saturated with new sources of information, databases, tools and resources. Government entities, think tanks, public-private partnerships and everyone in between have upped their data game. Open-source software is more abundant than ever, and technological advances have changed the way commercial real estate researchers procure, organize and analyze data. These new tools have also allowed those of us on the front lines to gain much-needed efficiencies, grow our skill sets and help our clients make better, more informed decisions. Several key aspects of the job have evolved over the last ten to fifteen years, changing the way we research real estate markets forever.
Staying Organized
While staying organized was important back then, it is absolutely critical today. Information overload, analysis paralysis, infoxication – or whatever you call it – is a real challenge for the modern-day commercial real estate researcher. Exponential growth in data sources, compounded by the 24-hour news cycle and the permeation of social media, has had a material impact on tracking market intel. So, how do we control the chaos?
Back in the day, I used a system dubbed the “Article Archive” (I can’t take credit for the name, though). This rudimentary database housed news articles and other media we wanted to keep for later reference, and it gave our market research team a basic level of informational organization. I still use a similar system today, but there are far more tools at my disposal. I’ve replaced the old archive with the mobile app Evernote, which doubles as a task list and note tracker. Using the Chrome and Outlook plugins, I’m able to quickly snip, save, manage and tag articles, emails, documents and notes into one efficient system, and the app is downloaded and synced to my computer, tablet and phone. Being able to easily retrieve articles from months ago, or mark them to read later, has been invaluable for my day-to-day workflow. If Evernote is not your thing, there are several alternatives: OneNote, Simplenote, Google Keep and Elephant are a few of the more popular apps that can help researchers manage market intel and news articles.
Other tools and resources my team and I use on the daily for organization include Trello, Google Docs, Dropbox and Slack. These help teams like ours stay on the same page no matter how much new information gets documented or changed.
Mapping and GIS
Everyone loves a good, quality map. And while proper mapping techniques have been around far longer than any of us, Geographic Information Systems (or GIS, as all of us “cool kid” researchers call it) revolutionized and digitized the process in the late 1960s, with ESRI being a pioneer of the technology. The commercialization of GIS software took place from the late 1970s through the 1990s, as advances in computer processors continued to drive improvements.
Other “OG” mapping tools include MapInfo, Global Mapper and Geomedia. These systems house, manage, manipulate, analyze and present spatial or geographic data. Given that location remains one of the most important factors influencing property value, rents and demand, the marriage between GIS and commercial real estate only seems natural. The industry, particularly the retail sector, has been utilizing these resources for some time, but location analytics are a necessity in today’s marketplace.
Similar to other data sources, the sheer number of mapping and GIS platforms has swelled in recent years. This is particularly true among web-based services and mobile applications. MapInfo/Anysite, a desktop program, has been my go-to GIS tool for several years now. Using this software, I can create sharp-looking maps and demographic analyses for brokers, landlords, tenants and the like. Yet, I also utilize Google Earth Pro, MapBox, MassGIS, MuniMapper and Python among other sources to create maps and display data. It’s not just programs and software either.
Mapping and GIS provide commercial real estate researchers with immense analytical power, helping us identify trends, patterns and opportunities that otherwise may not be apparent. The number of resources currently available to the community is a true enhancement. While it takes a lot of brain space to remember which resource to use to find the right data, I would never go back.
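To make that analytical power a bit more concrete: much of what GIS platforms automate starts with simple spatial math. As an illustrative sketch (not tied to any of the products above, and using made-up coordinates), here is the standard haversine formula in plain Python, computing the great-circle distance between two points:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in miles."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    dlat, dlon = lat2 - lat1, lon2 - lon1
    a = sin(dlat / 2) ** 2 + cos(lat1) * cos(lat2) * sin(dlon / 2) ** 2
    return 2 * 3958.8 * asin(sqrt(a))  # Earth's mean radius is ~3,958.8 miles

# Roughly how far apart are two hypothetical Boston submarket centroids?
dist = haversine_miles(42.3503, -71.0810, 42.3519, -71.0428)
```

A GIS platform wraps thousands of calculations like this one behind drive-time rings, trade areas and demographic overlays; the point of the sketch is only that the underlying math is approachable.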
Data Crunching Tools
Like every other industry in the known universe, commercial real estate is moving toward highly sophisticated, data-driven solutions. Given the amount of data and information we’re now working with on a daily basis, programs like Microsoft Excel and Access are simply too generalized to support our current research efforts.
Here at NAI Hunneman we have implemented SQL-based solutions, like Microsoft Dynamics, to collect market stats, lease comps, lease expirations and a myriad of other variables. We can now quickly and efficiently organize, aggregate and analyze any and all data being collected throughout the company. New to our arsenal of analytic tools is Stata: a powerful solution for exploring, visualizing and modeling data.
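The value of a SQL-based system doesn’t depend on any one vendor. As a portable illustration of the kind of aggregation it makes easy, here is a sketch using Python’s built-in sqlite3 module against a hypothetical lease_comps table; the schema, submarkets and figures are invented for the example, not our actual setup:

```python
import sqlite3

# Hypothetical schema: the column names and numbers are illustrative only.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE lease_comps (
    submarket TEXT, sf INTEGER, rent_psf REAL, signed TEXT)""")
conn.executemany(
    "INSERT INTO lease_comps VALUES (?, ?, ?, ?)",
    [("Seaport", 25000, 68.50, "2019-03-01"),
     ("Seaport", 12000, 72.00, "2019-06-15"),
     ("Back Bay", 18000, 61.25, "2019-05-20")])

# A size-weighted average rent by submarket -- the sort of market stat
# a centralized SQL database can produce on demand instead of by hand.
rows = conn.execute("""
    SELECT submarket,
           SUM(sf) AS total_sf,
           ROUND(SUM(rent_psf * sf) / SUM(sf), 2) AS wtd_avg_rent
    FROM lease_comps
    GROUP BY submarket
    ORDER BY submarket
""").fetchall()
```

Once comps live in one queryable place, the same pattern extends to lease expirations, absorption and any other variable the company collects.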
These new tools have let my team significantly reduce the amount of time spent on redundant processes (hallelujah!). Other popular analytics platforms I’ve come across in recent years include Tableau, Domo and Qlik. Coding is all the rage; even tiny tots are learning computer programming these days, and some coding skill will prove invaluable to today’s research professionals. There are a number of languages to choose from, and all of them can be used for scraping (the process of automating data collection through code) to pull critical information from public resources like assessors’ databases and industry directories. That data can then be used to generate charts and graphs systematically, and to check large datasets for inconsistencies and discrepancies. From there we can glean all of the great informative morsels you like to talk about at cocktail parties, and the data visualizations you share on your social media feeds.
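As a sketch of what scraping looks like in practice, the following uses only Python’s standard library to parse a made-up snippet of assessor-style HTML and then run a basic sanity check on the values. The markup, parcel IDs and figures are all invented; real sites differ, and it’s worth confirming a site’s terms of use before scraping it:

```python
from html.parser import HTMLParser

# An invented snippet of the kind of HTML table a public assessor's
# database might serve up -- field names and values are illustrative.
SAMPLE = """
<table id="parcels">
  <tr><td>0101-001</td><td>100 Main St</td><td>2450000</td></tr>
  <tr><td>0101-002</td><td>120 Main St</td><td>1875000</td></tr>
</table>
"""

class ParcelScraper(HTMLParser):
    """Collects each <tr> as a list of its <td> cell texts."""
    def __init__(self):
        super().__init__()
        self.rows, self._row, self._in_td = [], None, False

    def handle_starttag(self, tag, attrs):
        if tag == "tr":
            self._row = []
        elif tag == "td":
            self._in_td = True

    def handle_endtag(self, tag):
        if tag == "tr" and self._row:
            self.rows.append(self._row)
            self._row = None
        elif tag == "td":
            self._in_td = False

    def handle_data(self, data):
        if self._in_td:
            self._row.append(data.strip())

scraper = ParcelScraper()
scraper.feed(SAMPLE)
parcels = [(pid, addr, int(val)) for pid, addr, val in scraper.rows]

# The consistency check half of the job: flag assessed values that
# fall outside a plausible range for follow-up.
suspect = [p for p in parcels if not (10_000 < p[2] < 1_000_000_000)]
```

In a real workflow the HTML would come from an HTTP request and the cleaned rows would land in the database, but the parse-validate-load shape stays the same.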
Artificial Intelligence Advancements
Simply put, AI is defined as intelligence demonstrated by machines. While the concept of AI is not new (who hasn’t seen The Terminator?!), it wasn’t until relatively recently that the commercial real estate industry began to embrace this advancement in technology. Machine learning certainly wasn’t something on my radar ten years ago. How is the industry using AI today and what does the future look like?
As previously discussed, big data is here to stay, and research professionals need every resource they can get to collect and analyze outsized amounts of data and information. AI undoubtedly can help with these functions, but where I see the most useful application is with predictive analytics.
As an industry, we’re always looking toward the future. Where are rents headed? Are vacancies expected to increase or decline? What are the expected returns for specific buildings? There are several startups developing AI platforms to help answer these forward-looking questions. Skyline plans to use hundreds of data inputs to provide expectations on returns for multifamily properties throughout the U.S. Enodo boasts a similar platform for the multifamily market, aiming to identify potential investment properties, calculate market rents and run other analyses. In the development arena, CityBldr is using machine learning to identify a property’s highest and best use, estimate anticipated returns and help with site selection. Services like IdealSpot are also utilizing machine learning to deliver key insights and predictive location analytics to users. As a researcher, being able to feed the data I’m collecting into a program that not only analyzes the information but also produces predictive results is a game-changer. As the industry continues to innovate, researchers will keep shifting toward more sophisticated technologies like AI.
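These vendors don’t publish their models, but the core idea behind any trend-based forecast can be sketched in a few lines. The toy example below fits a least-squares trend line to invented submarket rent figures and extrapolates one year ahead; a real predictive platform would weigh hundreds of inputs (vacancy, absorption, employment, the supply pipeline, and so on) rather than time alone:

```python
# Invented year-over-year average asking rents for a hypothetical
# submarket, in $/SF -- purely illustrative figures.
years = [2014, 2015, 2016, 2017, 2018]
rents = [42.0, 44.5, 47.1, 49.8, 52.6]

# Ordinary least squares fit of rent on year: rent = a + b * year.
n = len(years)
mean_x = sum(years) / n
mean_y = sum(rents) / n
b = sum((x - mean_x) * (y - mean_y) for x, y in zip(years, rents)) / \
    sum((x - mean_x) ** 2 for x in years)
a = mean_y - b * mean_x

# Extrapolate the trend one year forward.
forecast_2019 = a + b * 2019
```

The point is not that a straight line is a good rent model (it isn’t, on its own), but that "predictive analytics" is ultimately fitted relationships plus extrapolation, which is exactly where more data and better algorithms pay off.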
The commercial real estate industry has transformed and advanced significantly since the mid-2000s; it’s undeniable. For those of us in research, expanding our skill sets to meet the growing need for better, more accurate data and analyses has been one of the most fundamental shifts of the last decade or so. Yet there are still advancements to be made. Automation will remain at the forefront of future progress in the research industry; tasks that would take a team of people days or weeks may take one person only a few hours. Machine learning and artificial intelligence will also continue to disrupt the industry, changing the way researchers use data to inform decision makers. But no matter how much data we collect or how well we design machines to analyze it, a qualified researcher will ultimately have to be there to make sense of it all. Here’s to the researchers, the unsung heroes informing the entire industry from behind their walls of computer screens.