
7 Takeaways from the Gartner Data and Analytics Summit

By Amy McCloskey Tobin | March 23, 2023

As always, Gartner delivered when it came to analysts’ views on the enterprise data science space, showcased at the Gartner Data and Analytics Summit in Orlando, Florida, this past week. The conference was chock-full of brilliant data geniuses there to learn and share their knowledge. Here are my takeaways from the summit:

1. It is the age of Natural Language Technologies

Bill Gates recently penned a fantastic piece calling this the age of AI, but I have a different view. AI has been here for years – all of us have become accustomed to chatbots, fitness wearables, and Alexa. What is clear to me, despite or because of the hype around ChatGPT and Large Language Models (LLMs), is that people working in analytics have grasped how important natural language processing (NLP), understanding (NLU), generation (NLG), and query (NLQ) are when it comes to understanding data at scale. Some LLMs have raised security concerns, whereas other language technologies can be used more securely behind enterprise firewalls. ChatGPT is fascinating to use and will make many of us more productive, but what is truly important to businesses right now is understanding their data – at scale and instantly. That’s what Natural Language Technologies (NLTs) do. And they do it well.


2. NLG is the language technology to watch

In its hype cycle, Gartner positioned Natural Language Generation (NLG) as a technology on its way up, and the managers bringing NLG to the forefront of their enterprise tech stacks are firmly in the “visionaries” category. Arria NLG can integrate into any major Business Intelligence platform, or into a proprietary platform, and it does not persist data. Used this way, NLG can serve as guardrails for generative AI.

“Enterprises that use existing Natural Language Technologies to segregate their proprietary data will be able to use generative AI faster and more safely.” – heard at the Gartner Data & Analytics Summit 2023

3. Dashboards are not dead

BI tools are foundational to dissecting and understanding data. More than one software company has proclaimed the demise of dashboards, yet businesses using BI platforms are five times more likely to reach faster decisions than those that do not. BI dashboards are still an essential tool across enterprises, with millions of dollars already invested in them. Instead of trying to kill them off, smart business managers are integrating NLG into their BI platforms so that every user – including non-experts – can clearly understand their BI visuals. It will be interesting to watch the BI wars continue – Microsoft has invested heavily in ChatGPT, and Power BI is now the Goliath of BI platforms. However, talk to any Business Intelligence manager and they will tell you it is nearly impossible to get users to switch platforms – especially if they are Tableau users, a ferociously passionate group.

4. The new focus for BI tools is the user

BI solutions are not going away, but they are evolving. Instead of focusing simply on capabilities that get data techies excited, the new focus is on the user experience. Again, this is where the integration of natural language technologies like NLG will make a huge difference in how much information a user can get from their dashboards – and how quickly. Give the user a plain-language narrative explaining their data, and they are light years ahead of users trying to decipher visuals on their own. Data understanding becomes easier, faster, and more scalable while using a familiar platform.
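To make that concrete, here is a minimal, hypothetical sketch of the idea: a template-based routine that turns the figures behind a single dashboard tile into a short plain-language narrative. The metric names and the 10% “significant” threshold are invented for illustration; this is not Arria’s API, and a production NLG engine would be far more sophisticated.

    # Hypothetical sketch: narrate the numbers behind one dashboard tile.
    # Metric names and the 10% "significant" threshold are invented for illustration.
    def narrate_revenue(current: float, prior: float, region: str) -> str:
        change = current - prior
        pct = (change / prior) * 100 if prior else 0.0
        direction = "up" if change >= 0 else "down"
        narrative = (
            f"Revenue in {region} was ${current:,.0f}, "
            f"{direction} {abs(pct):.1f}% from the prior period."
        )
        if abs(pct) >= 10:
            narrative += " This is a significant swing and may warrant attention."
        return narrative

    print(narrate_revenue(1_250_000, 1_100_000, "EMEA"))
    # Revenue in EMEA was $1,250,000, up 13.6% from the prior period. This is a significant swing and may warrant attention.

Even a toy example like this shows why pairing the visual with a sentence helps the non-expert reader get to the point faster.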

5. Decision Intelligence needs to happen in real time

Not all business decisions are instantaneous. Strategic, long-term planning obviously takes time, reflection, and a deep understanding of business data. However, many business decisions need to be made in seconds – resource allocation and operational decisions usually happen instantaneously. If you are going to use AI to make business decisions, that AI needs to be accurate and instantaneous. Again, I go back to my “it is the age of NLG” stance: NLG provides situational awareness around your data – what is happening in your business, and what is likely to happen.

6. Data prep is critical

Without proper data preparation, the data in a business intelligence platform may contain errors, inconsistencies, and missing values that can lead to incorrect insights and decisions. The quality of data in a BI tool directly affects the accuracy and reliability of the reports and analyses generated from that data.

Effective data preparation helps identify patterns, trends, and relationships that might not be immediately apparent. By cleaning and standardizing data, analysts can draw more accurate conclusions and make informed decisions based on the insights they gain from it. The more data coming at us, the more critical data prep becomes.
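As a simple illustration of what that cleaning and standardizing can look like in practice, the hypothetical snippet below uses pandas to de-duplicate records, normalize dates and text values, and handle missing values before the data is handed to a BI tool. The file and column names are invented for the example.

    import pandas as pd

    # Hypothetical cleanup before loading data into a BI platform.
    # File and column names ("sales.csv", "order_date", "region", "revenue") are invented.
    df = pd.read_csv("sales.csv")

    df = df.drop_duplicates()                                              # remove duplicate rows
    df["order_date"] = pd.to_datetime(df["order_date"], errors="coerce")  # standardize dates
    df["region"] = df["region"].str.strip().str.title()                   # normalize text values
    df["revenue"] = df["revenue"].fillna(0.0)                             # fill missing revenue
    df = df.dropna(subset=["order_date"])                                 # drop rows with unusable dates

    df.to_csv("sales_clean.csv", index=False)                             # ready for the BI tool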

7. Governance is even more critical

Generative AI poses unique challenges when it comes to governance. It can produce content that is difficult to distinguish from content created by humans, which raises concerns about potential misuse – deepfakes, false news, and other forms of digital manipulation.

Governance is critical to ensuring that generative AI is developed and used in a responsible and ethical manner, and AI systems must be transparent, explainable, and auditable. Developers and users of generative AI should be able to understand how a system works, what data it uses, and how it generates content. Transparency is important for building trust in generative AI and for detecting and addressing potential biases or errors.


The Gartner summit was inspirational for anyone working in the AI space. Those of us working specifically in NLG came away understanding that it is our time. ChatGPT and the other large language models made the world sit up and take notice of generative AI and language technologies. If you want to learn more about how Arria NLG can integrate into your existing platforms, check us out here.

