LIGHTBOX CONNECTS
Patrick Jacolenne
Founder & Chief Data Strategy Officer,
DATALOGIQ 360
Companies making the biggest strides in the CRE ecosystem have figured out how to leverage data to drive better decisions. But...
Michael Griffin: So, Patrick, the commercial real estate ecosystem is this giant, trillion-dollar asset class, and the companies that seem to be making the biggest strides have figured out how to leverage data to drive their decision making. But this is really hard to do.
Michael Griffin: So what are some of the most common issues with data management that make this so tough?
Patrick Jacolenne: Yeah. So the top three challenges we see when engaging with the multiple parties across the lending and commercial real estate ecosystem are, number one, just consistency. There’s a lack of consistency in the way that data is sourced, managed, and then leveraged across those various entities and parties. And there’s a lot of exchange of information that’s originally digitized, then put on paper, then digitized again, and that creates significant challenges for all of the parties that are part of a single transaction to effectively manage that process. So that’s number one, consistency.
Patrick Jacolenne: The second is the timeliness of the data. All of the various sources, the 3,000-plus sources that could be part of a single transaction, have varying degrees of freshness, which can also cause problems because it then leads to consistency issues as well.
Patrick Jacolenne: So the information you’re relying on to make an informed decision around the valuation of the property, the market share, the market economics, and so on can present challenges. If that data isn’t timely, you’re making those decisions based on stale data that’s six or twelve months old, and we saw the net effect of that during the pandemic over the past three years. The third is general standardization. The industry is trying to collect and make available the publicly available data from those parties.
Patrick Jacolenne: But the challenge moving forward, and the big opportunity, is around standardizing proprietary data. That is the historical information the lenders are sitting on, and how they marry it up with third-party data. Today they rely on third-party data to be the authoritative source rather than their own proprietary data, and the question is how they augment that with alternative sources that enrich the picture, create greater insights, and unlock the ability to make decisions around what has happened, what is happening, and what will happen.
Michael Griffin: The last couple of years have totally upended a lot of the models and underwriting philosophies on the investment side of the business, particularly with lenders. So if you had to pick, what would you consider to be the biggest hurdle for a commercial lender related to property data?
Patrick Jacolenne: Yeah, I’d say over the past three years, as you mentioned, the pandemic and its related events have really had a knock-on effect for lenders, and part of that is really pushing them toward digital transformation. But part of the challenge is that many financial institutions continue to use manual lending processes that are overly complex.
Patrick Jacolenne: Those processes add a lot of cost to the transaction and create a lot of unnecessary delays. One of the data points we’ve seen when we go in and survey our lenders and financial institutions is that 50% or more are still relying on manual documentation, meaning handwritten documentation that they then have to transcribe into a digital source, and that just creates a lot of unnecessary complexity.
Patrick Jacolenne: The pandemic put a spotlight on this. Traditionally, how you would evaluate and assess a deal or a market opportunity is to look at historical trends and patterns. The reality is, the pandemic was no historical trend or pattern.
Patrick Jacolenne: It was an anomaly. And so this challenge of relying on manual processes while needing rapid, informed decisions, to look at behaviors and trends, to study the market, and then determine how you need to pivot and focus, is the biggest one. So, just to highlight what the pandemic really exposed: it is the manual processes, the inconsistencies between those processes, and the opportunity that’s still out there, as far as market share, to digitize.
Michael Griffin: And I hear a lot about the concept of persistent IDs and connected data. So what does all that mean, and why is it so important to our industry?
Patrick Jacolenne: Yeah, in a nutshell, a persistent ID offers an opportunity to create a single source of truth for the data. Take the 3,000-plus aggregated data sets and connect them with that unique ID, which creates a high degree of trust and confidence, and also the ability to assess that data in a timely manner. I think of it very much like a social security number. A social security number gives each of us a unique ID, but it also connects our relationships: first name, last name, address, perhaps previous addresses, relationships to other individuals, spouses, nieces, nephews. That creates this collage, if you will, of data points that helps inform us around who we are as an individual, creating that trust. And then the other benefit of having that persistent, unique ID is standardization.
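To make the social security number analogy concrete, here is a minimal Python sketch of what connecting two source records on a shared persistent ID might look like. The sources, field names, and values are invented for illustration; they are not an actual LightBox or DATALOGIQ 360 schema.

```python
# Two hypothetical extracts that describe the same property; the only field
# they are guaranteed to share is the persistent ID.
assessor_record = {
    "persistent_id": "PROP-000123",
    "address": "100 Main St, Springfield",
    "lot_size_sqft": 21780,
}
listing_record = {
    "persistent_id": "PROP-000123",
    "asset_type": "retail",
    "asking_rent_psf": 28.50,
}

def connect_on_persistent_id(*records: dict) -> dict:
    """Merge records that carry the same persistent ID into one connected view."""
    ids = {r["persistent_id"] for r in records}
    if len(ids) != 1:
        raise ValueError(f"Records refer to different properties: {ids}")
    connected: dict = {}
    for record in records:
        connected.update(record)  # each source contributes its own attributes
    return connected

print(connect_on_persistent_id(assessor_record, listing_record))
```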
Patrick Jacolenne: So imagine having a single property record that has the same look and feel based on that persistent ID, and there’s no translation needed. You don’t need a Rosetta Stone to translate what I looked at yesterday versus what I’m looking at today, because the property record still has the same look and feel. The address and the profile of the property could be very different, but again, that standardization really helps us evaluate and then understand the property itself.
Patrick Jacolenne: And then the ability to integrate and fuse multiple sources, both internal and external, is, I think, one of the primary advantages of having this persistent ID.
Patrick Jacolenne: And so that can help deliver a complete view of the property. That’s kind of the ultimate North Star everyone is after: a complete view of the property. How can you get a 360-degree view, which is complementary to all of these various data points that are giving you primary characteristics about the property, but also secondary and tertiary ones?
Patrick Jacolenne: Those are giving you information around vacancy, rent, sales comparisons, and tenants. So you can imagine, as you go out to those secondary and tertiary layers, you’re seeing new insights that might drive opportunities around vacancy rates or property values or rents.
Patrick Jacolenne: And so that’s where the persistent ID comes into play. I think one of the ultimate challenges that commercial real estate deals with is the various types of data, and this is one aspect that is overlooked. You have your traditional commercial property data, which is the asset type, the location, the building size, the zoning, the lot size, and so on.
Patrick Jacolenne: Then you have transactional data. This is the data that helps analyze the past and present value of a property and its potential return on investment; a lot of times that’s what the lenders, brokers, and investors are using in their analysis. Then you have your commercial loan and lender data. This type of data is really about the debt: the debt history, mortgage amounts, and past and present debt associated with a property and its owner. Then you have commercial tenant data, which is information about current and past tenants and their lease expirations. Then you have ownership data, which provides information about past and present owners of the actual property. And then you have the market data itself.
Patrick Jacolenne: so cost of the property on the market, reasons for investing a certain building amenities, types, returns, etc. So you know, I rattled off like no less than 6. They vary different types of categories of commercial data.
Patrick Jacolenne: Now imagine trying to connect all of that data about a single property when it’s spanning over 3,000 sources. That’s very difficult to do without a persistent ID. With a persistent ID, it creates that linkage.
Patrick Jacolenne: and then the ability to be able to create those relationships across those various sources, and then to go back to the earlier point around what are the 3, you know, factors that really affect and impact the commercial real estate market? When you think about one I mentioned, you know, consistency to standardization and 3 timeliness. That’s where a persistent ID can really drive efficiencies and optimization in those areas to be able to unlock the data.
Michael Griffin: I’m always amazed by the number and the complexity of data sources that are available these days for properties, but it seems like you need a PhD to really make sense of it all. So what are some of the more common challenges that you’ve seen with attempts at connecting all this property data?
Patrick Jacolenne: Yeah, I think it is market adoption.
Patrick Jacolenne: I think it’s relatively easy to come up with a unique identifier and a go-to-market strategy around it. But it’s then the adoption rate: how do you saturate that unique identifier into the market so it becomes really the authoritative source?
Patrick Jacolenne: and really create that dependencies between all of the various entities and parties involved in the transaction to leverage that ID so the idea is being persistent across the actual transaction itself. And so there’s some creative ways, and that can be around being the standard or being viewed as the standards organization, as it relates to the ID, and really demonstrating the value that it unlocks. So it unlocks. You know the quality, the completeness, and the accuracy of a true, complete property record.
Patrick Jacolenne: So I think that is probably the single most challenging thing, the adoption. And a lot of times what companies are trying to do now is integrations: how do you integrate into the actual workflows where the point of decision is being made and leverage the ID there?
Patrick Jacolenne: So instead of passing bulk data back and forth across the transaction via the API, you’re just passing the ID around, and the ID then provides the data points that are necessary to make that point-in-time decision.
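As a hedged sketch of that integration pattern, the example below passes only the persistent ID into a workflow step and requests just the fields needed for the decision, rather than moving the full record around. The endpoint, parameters, and field names are hypothetical and do not describe any particular vendor’s API.

```python
import requests  # standard HTTP client; the endpoint below is illustrative only

BASE_URL = "https://api.example-property-data.com/v1"

def fields_for_decision(persistent_id: str, fields: list) -> dict:
    """Resolve a persistent ID into only the data points a workflow step needs,
    instead of shipping the bulk property record through the transaction."""
    response = requests.get(
        f"{BASE_URL}/properties/{persistent_id}",
        params={"fields": ",".join(fields)},  # e.g. "vacancy_rate,last_sale_price"
        timeout=10,
    )
    response.raise_for_status()
    return response.json()

# Example: an underwriting step that only needs two data points at decision time.
# snapshot = fields_for_decision("PROP-000123", ["vacancy_rate", "last_sale_price"])
```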