Change has been around every corner in 2020. We’re creating data and being bombarded by it (it’s everywhere!), and yet many feel like they’re pedaling backward when trying to use that data to manage change – especially in business. To gain some perspective, we recently “Zoomed” with Paul Young, General Manager of Magnitude’s Data Integration Business Unit, asking him about managing change into and beyond 2021, and the importance of reinventing data connectivity to meet evolving customer needs.
What are some challenges that accelerated this past year in the data world?
“If you don’t like change, you’re in the wrong business” is no longer applicable only to the technology industry. 2020 has shown us that there’s no such thing as stability – no matter what industry you’re in. Take the beer industry, which many consider to be a very steady and predictable business. This year, major canned beverage suppliers faced a worldwide shortage of aluminum cans. Triggered by the surge in demand from home-bound consumers ordering canned beverages, breweries had to pivot from selling most of their beer in large kegs to bars and restaurants to selling canned products, thus creating the shortage. This is just one example of the dramatic changes in consumer behavior brought on by the pandemic that have disrupted supply chains across industries.
You can’t stop change, but you CAN manage it. To manage change, of course, you first need to understand it. This starts with having access to data and analytics for timely intelligence – not only to manage a potential disruption, but also to understand the broader implications of changing consumption habits and the remote work trend. As such, the winners will be those that define their new normal: that see change and react swiftly. Success and failure hinge on your speed of reaction.
What changes are you seeing in the market?
The pace of change, or what I refer to as the “half-life of technology,” is constantly shrinking, and businesses are propelling a giant pendulum swing in new technology adoption. There was a time when Oracle was the only database we allowed in the datacenter; then along came SQL Server, Teradata and Hadoop … then Redshift, BigQuery and now Snowflake. The same applies to ETL as well as the analytics space. In analytics, for example, Business Objects used to rule the roost, but now we have Tableau, Qlik and MicroStrategy, to name a few. All have their specialties and all are being adopted. The good news is, where we once had one simple tool, we now have a plethora of tools. The bad news is that this does not necessarily make life easier. What we are not seeing is a transition between tools, but rather a constant addition of tools driving ongoing replication of the data. To put it candidly, this is resulting in diminishing returns as organizations grapple with technology absorption.
In parallel, we have the continued migration to the cloud and SaaS as datastores. Where we used to buy applications and databases separately, today, as we move to SaaS, it’s not always clear what the database is. This speed of technology adoption has created a world that’s increasingly fractured. And we’re not replacing – we’re adding to and retaining what we had. This technology abundance is characterized by continuing growth in cloud migrations and data stores. More and more data is moving to the cloud. But all too often, that same data simultaneously exists on-premises and in the cloud, resulting in multiple copies of that data. In almost all of these instances, we’re seeing this pendulum swing driven by financial decisions – with software players overplaying their hand – rather than by the technology need. Unless a business requires tremendous extra compute, that structure is often superfluous.
The more valuable solution to replication is to re-platform, but that is a much larger and riskier project, which makes it seem easier to just copy data. We are, however, seeing baby steps taken toward better-managed migrations. And increasingly, I’ll note, a lot of the functionality and essential data businesses need to run is found in SaaS applications. Still, many of those don’t have good analytics solutions. So it sounds like a bit of a mess. But in reality, this is probably a new normal. The pendulum will settle, but not where it once was. Companies will have multiple databases, multiple applications and multiple analytics capabilities, and they’ll be testing and discarding a lot more technologies at a quicker pace.
Add to that, where businesses once required ownership of their databases, today AWS and Microsoft have captured a tremendous share of new database spend. Given the market dominance of these players, we naturally hear a lot from customers concerned about cloud lock-in, and they are looking at technologies to manage that risk – hence the amazing growth of Snowflake. The winners will be the businesses that can get their heads around all these trends structurally, for the longer term.
Consider as well the long-term implications brought about by the pandemic. Touchless, contactless, remote and curbside delivery – consumers have embraced new digital behaviors, and when change takes place for more than a year, it rarely goes back. We’re seeing this trend unfold in the retail industry. Armed with higher-quality data on consumer shopping habits – on what they want to buy, not just what’s available in-store – retailers have the opportunity to design new experiences that build customer loyalty. Driven by data, a big-box retailer can innovate and transform its stores into mini warehouses to compete with Amazon.
What are things that companies and people need today?
Despite the exponential growth of applications and data sources, users actually have less access to the data they need for real-time business analysis. This is the area in which businesses need the most urgent help: connecting enterprise applications and data sources – wherever they reside – to extract value from their data so they can survive and thrive at the speed of change.
In short, businesses need access to data to get answers to business questions. They need controlled and secure access to all their data, and they also need that access to be performant and cost-effective. Referring back to the previous question, this requires understanding what data is being accessed from where, determining the cheapest and fastest place to run the workload – or whether the data is needed at all.
To adapt to change, businesses need the flexibility to quickly connect to a myriad of dynamic data sources as needed. As such, data connectivity should be scalable, simple to manage, seamlessly updated and secure – this is our value proposition for Simba connectivity.
How is Simba responding to and addressing these needs?
Now more than ever, trust and reliability are the currency of business, and we’ve established our reputation as the highest-quality provider. Beyond speed and price, quality matters to customers. With that in mind, we’re honored to be the industry choice for standards-based data access. Our customers rely on us as the trusted partner for meeting their demands to access and connect to data from any data source. Today, enterprises and software developers run trillions of transactions on our software.
In the same way that our customers are adapting to new ways of working, we’re laser-focused on bringing new connectivity innovations to market. For instance, with Magnitude Gateway, we’re reengineering how we build connectivity so it can move at the speed of data. If there’s a change with Salesforce, we have the resources to integrate and certify all new drivers and data sources.