Understanding Technology's Role In Creating Successful Data Cultures
A robust data culture is critical to an organization's success, yet many organizations struggle to create one. Several factors contribute to a strong data culture, and technology is one of them.
In 2023, Chief Data Officers and Chief Technology Officers will be uniquely positioned to tackle data culture challenges by working together to address interlinked data and technology concerns. CDOs and CTOs understand that the right tools can help turn data into actionable insights that drive decision-making, but there are other ways technology can help make your data culture a success.
In this post, we'll dive into three critical ways:
Setting the Tone with Smaller, Purpose-Built Data Warehouses
Reimagining Self-Service Analytics
Increasing and Encouraging Collaboration
Setting the Tone with Smaller, Purpose-Built Data Warehouses
The current approach: Enterprise Data Warehouses (EDW)
EDWs provide a centralized location for storing an organization's data. The visibility they provide into an organization's data helps set the tone for a data-centric culture, where everyone understands the importance of data and how they can use data to improve decision-making. But despite using these enterprise data warehouses, data cultures are still not where they need to be.
The root issue: EDWs Can't Keep Up
Enterprise data warehouses are essential to meet the needs of large enterprise-wide efforts. But getting data out of them and into decision-makers' hands is still a more complex and lengthy process than it should be. Real-time, quality data is becoming essential to generating value, and data warehouses must begin to reflect the characteristics of the data they are expected to hold. These data warehouses must be more agile, flexible, and targeted toward individual use cases and users so that users aren't bogged down waiting on others within the data lifecycle to get the information they need.
The solution: Data Applications
In 2023, consider looking into Data Applications as a simple but effective way to spin up and distribute more targeted data warehouses to users. Data applications are purpose-built data solutions that are more agile and flexible than large data warehouses, allowing them to blend and store use-case data that is more finely tuned to the needs of use-case stakeholders. These applications also deliver accompanying data and analytics capabilities that allow stakeholders to do data work and analysis close to the data rather than in isolation from it. By providing simpler, targeted data solutions, work can progress faster, collaboration between stakeholders can improve, and stakeholders can create new processes to help streamline operations.
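To make the idea concrete, here is a minimal sketch of the purpose-built pattern: instead of routing every question through a monolithic enterprise warehouse, a small data mart holds only the slice of data one use case needs, and stakeholders query it directly. The table, columns, and figures below are illustrative assumptions, not drawn from any specific product.

```python
import sqlite3

# An in-memory SQLite database stands in for the lightweight,
# use-case-specific store (here, a hypothetical regional sales review).
mart = sqlite3.connect(":memory:")
mart.execute("CREATE TABLE sales (region TEXT, product TEXT, amount REAL)")

# Load only the slice of enterprise data this use case needs.
rows = [
    ("EMEA", "widgets", 1200.0),
    ("EMEA", "gadgets", 800.0),
    ("APAC", "widgets", 950.0),
]
mart.executemany("INSERT INTO sales VALUES (?, ?, ?)", rows)

# Stakeholders can now answer their own questions close to the data,
# without waiting on a central team.
total_by_region = dict(
    mart.execute("SELECT region, SUM(amount) FROM sales GROUP BY region")
)
print(total_by_region)
```

The point of the sketch is scope: because the store contains only what the use case requires, it can be spun up, understood, and queried by its stakeholders in minutes rather than weeks.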
Reimagining Self-Service Analytics
The current approach: Tableau, Power BI, Alteryx, etc.
While smaller, purpose-built data warehouses can help set the tone for a data-centric culture, it's also essential to encourage self-service business intelligence. This is where employees at all levels of an organization are given access to the tools and training they need to explore and analyze data on their own. Traditionally, we see this approach done through tools and systems such as Tableau, Power BI, Alteryx, and more.
The root issue: Not All Self-Service Is Created Equal
Let's quickly revisit the previous sentence: "...This is where employees at all levels of an organization are given access to the tools and training they need to explore and analyze data on their own." While this may sound good on paper, the reality is that there are many different definitions of what "access," "tools," "training," "exploring," and "analyzing" data looks like for people doing it "on their own."
CDOs and CTOs must understand that deploying hundreds of software licenses - like Tableau or Power BI - across the enterprise might look and sound like deploying self-service analytics, but the reality is that these tools often hurt self-service analytics more than they help. This approach can damage data democratization and the entire data culture at large, and it does so in two key ways:
At a micro level: Tools like Tableau and Power BI are reserved mainly for individuals who are specialized and skilled in these tools. When looking at who needs to use real-time, quality data, these technical users comprise a relatively small portion of the enterprise population. There is a massively underserved population of business-facing users making decisions every day who are either constantly waiting on data or not supported by data at all. Making data easier to access, manage, understand, digest, and analyze for these users is essential. It's unreasonable to expect everyone to become a Tableau guru. It's not unreasonable to have data meet people where they stand, rather than the other way around.
At a macro level: These tools have a narrow focus on supporting individual analysis for individual users. They carry no responsibility or incentive to factor in the bigger data picture, especially at an enterprise level. So when they are individually cobbled together to meet more extensive collaborative data efforts, they exacerbate data problems for the organization. Such problems include:
Creating multiple versions of the truth
Adding complexity to internal data environments
Preventing access to data
Slowing down decision-making
Hindering collaboration efforts
Increasing cybersecurity risk
Increasing costs to the organization
These problems become most apparent in cross-departmental or cross-organizational efforts where multiple stakeholders are working toward a common business objective. It is essential that these stakeholders work with a unified, consistent, and accurate set of data, ideally one that updates in real time. But with a tools-first approach like Tableau's, this is nearly impossible, because this type of self-service data initiative is not backed by centralized management in the form of unified data models and agreed-upon definitions and measures. As a result, it creates a profusion of reports and dashboards, leading to conflicting and overlapping versions of the truth. Additionally, the information generated from these initiatives is typically stored away within spreadsheets, never to be reused again.
Considering how much this exact approach is replicated across an enterprise daily, it's no surprise that it no longer makes financial or operational sense for enterprise organizations.
The solution: Raise Your Self-Service Analytics Standards
Look to redefine your understanding of self-service analytics in terms of how it impacts individuals directly and your organization on a larger scale. Empowering more users to take action with data in a way that feels familiar and easy is critical. Requiring extensive training or maintaining hard dependencies on others doesn't build the sense of community needed to create a successful data culture. So don't just think about what these capabilities look like - think about where these capabilities exist and where self-service analytics is taking place.
For individual users: Provide advanced yet simple self-service data and analytics capabilities that can empower users to do more with data. Such capabilities include:
Intuitive processes that users can implement in just a few clicks
This will help free up time for data-skilled individuals to focus on move-the-needle projects while empowering business users to securely work with accurate data and present it in the format required to tell their data stories.
For the organization: Look to implement these simplified capabilities within centralized, tightly-controlled environments backed by centralized data models. In this way, you can start to:
Create single versions of the truth
Reduce complexity of internal data environments
Open up access to data
Speed up and improve decision-making
Enhance collaboration efforts
Decrease cybersecurity risk
Decrease costs to the organization by reducing licensing fees
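The "single version of the truth" piece of this list comes from backing self-service with centralized data models. A minimal sketch of that idea: agreed-upon measure definitions live in one shared catalog, and every report or dashboard resolves metrics through it, so two consumers can never silently disagree on what a measure means. The metric names and records below are hypothetical.

```python
# Illustrative records a report might run against.
orders = [
    {"amount": 100.0, "refunded": False},
    {"amount": 40.0, "refunded": True},
    {"amount": 60.0, "refunded": False},
]

# The single, agreed-upon definition of each measure.
METRICS = {
    "gross_revenue": lambda rows: sum(r["amount"] for r in rows),
    "net_revenue": lambda rows: sum(
        r["amount"] for r in rows if not r["refunded"]
    ),
}

def compute(metric_name, rows):
    """Resolve a metric through the shared catalog rather than
    letting each dashboard re-implement its own formula."""
    return METRICS[metric_name](rows)

print(compute("gross_revenue", orders))  # 200.0
print(compute("net_revenue", orders))    # 160.0
```

In practice this role is played by a governed semantic layer rather than a Python dictionary, but the design choice is the same: definitions are authored once, centrally, and consumed everywhere.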
Increase and Encourage Collaboration
Collaboration is key to any successful organization, yet many organizations struggle to get employees to work together effectively. When creating a successful data culture, it's essential to empower strong and effective collaboration between people and the data they are working with.
Current approach: Static, Rigid, Manual Sharing
The two trends above pave the way for the current approach to data collaboration. A common pathway often looks like this: an individual user pulls data from EDWs into Tableau to generate a specific data asset or piece of insight, then emails, presents, or shares the information with their team via Slack.
The root issue: Disconnect
The most robust cultures are rooted in connectedness and togetherness. A major component of this togetherness is a mutual, consistent understanding of how the information within that culture is distributed and absorbed. Because the data architecture itself is disconnected, and because users establish their own preferences and approaches, that disconnect carries forward into data collaboration. The result is delayed, inaccurate data communication and collaboration.
The Solution: Connected, Collaborative Digital Environments
The key is to reduce disconnect wherever possible. Look for disconnect not just across data and systems but also across people and processes. Many organizations use collaborative workspaces like Google Drive or SharePoint to harmoniously and consistently store, manage, organize, distribute, and share information, and to kick off workflows and processes. Look to curate similarly centralized, tailored experiences for users around their data, allowing for quick, simple, and easy sharing between teams. Sharing capabilities should be tightly controlled yet effortless and intuitive when permitted. Converging people, processes, and data into a single collaborative environment also means that data strategy and management will improve over time, as users fine-tune the environment to meet their needs.
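"Tightly controlled yet effortless" sharing can be sketched in a few lines: a share succeeds in a single call, but only when a central access policy permits it. The assets, teams, and users below are hypothetical, and a real platform would enforce this through its governance layer rather than application code.

```python
# Central policy: which teams may receive each data asset.
ACCESS_POLICY = {
    "sales_dashboard": {"sales", "finance"},
}

# Running record of who each asset has been shared with.
shared_with = {}

def share(asset, user, team):
    """Share an asset if, and only if, policy allows the recipient's team."""
    if team not in ACCESS_POLICY.get(asset, set()):
        return False  # blocked: the policy does not permit this team
    shared_with.setdefault(asset, set()).add(user)
    return True

print(share("sales_dashboard", "dana", "sales"))       # True
print(share("sales_dashboard", "eve", "engineering"))  # False
```

The user experience stays one-click, while the control lives in one governed place instead of being re-decided ad hoc in email threads and chat messages.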
Technology plays a vital role in building a thriving data culture. CDOs and CTOs should think about how current approaches impact individual users and how they can impact the organization at large. By distributing more agile, purpose-built data warehouses, reimagining self-service analytics, and overcoming the disconnect around collaboration, technology can help organizations overcome some of the biggest challenges they face when it comes to creating a strong data culture.
Learn how Process Tempo can help you achieve a stronger data culture (and solve many other data problems) with our Data Applications. We combine graph data warehouses, next-gen self-service analytics, and game-changing collaborative environments into a single, streamlined solution to ensure that your technology is working to make your data culture efforts a success. Get started today >