Great Lakes Analytics exists because one pattern keeps repeating, and so does the fix. This is the work, where it came from, and why it's structured the way it is.
Across eight years working inside Ingredion, a Fortune 500 food and beverage manufacturer, in regional and then global roles spanning digital transformation, demand planning, master data management, and analytics, the same pattern surfaced in every environment.
Organizations invested in the right technology: ERPs, MES, analytics capabilities, IoT, and AI. But decisions were still slow, ownership was still unclear, and the data was still being debated instead of acted on. The data exists. The trust behind it doesn't.
The work of connecting IT systems to business users, translating data value for executive and C-suite leadership, and bridging global teams across regions made one thing undeniable: the gap wasn't technical. It was structural. Governance was fragmented, KPI ownership was undefined, and the decision cadence that should have moved things forward was spending its time reconciling numbers instead.
That observation became a framework. Not a better dashboard. A better environment for the data to operate in. One where trust was designed in, ownership was explicit, and the path from data to decision was clear enough that a VP didn't need to be in the room for it to happen.
That's why Great Lakes Analytics exists. Not to add another layer of technology to an already complex operation, but to fix the layer that determines whether the technology performs at all.
Stephen Gnidovec spent eight years as Global Operations Analytics Manager at Ingredion, a Fortune 500 food and beverage manufacturer operating globally. He worked in one of the most operationally complex environments in industry, where data had to perform under real margin pressure, across global teams, in conditions where slow decisions had direct consequences for supply chain performance.
Working across regional and global roles, he developed the discipline of proving data value to executives and C-suite leadership, not just presenting it. That required operating fluently between operations teams and boardrooms, between IT infrastructure and business users, and across organizational levels where the language of data had to translate into the language of decisions.
That practitioner background is the foundation of every Great Lakes Analytics engagement. The frameworks aren't borrowed from consulting playbooks. They were developed and stress-tested inside the same kind of environments your operation runs in.
Stephen holds an MBA and an M.S. in Data Science from Elmhurst University and earned his Lean Six Sigma Black Belt inside a live operating environment. He designs and teaches Data Science and MBA courses at two universities, at both undergraduate and graduate levels, developing the next generation of digital workers and data-driven leaders, from beginners to advanced practitioners. He is the author of The Data Culture Handbook, which documents the methodology in full.
The frameworks don't stay on paper. They're presented, debated, and tested in front of practitioners, executives, and academics who push back on them.
Chicago, IL: Presented and chaired the Food & Beverage Smart Manufacturing Summit, bringing together operations leaders and technology practitioners at the intersection of Industry 4.0 and operational execution.
Elmhurst University: Invited as an expert panelist to address how AI is reshaping professional roles and what practitioners and future leaders need to do to stay ahead of the shift.
The original diagnostic work focused on a single recurring problem: data trust fracturing across enterprise environments. That work became the Data Trust Index (DTI), a structured framework for identifying where confidence in data breaks down across governance, systems, and stakeholder alignment.
As engagements moved deeper into Smart Manufacturing, a new dimension emerged. Industry 4.0 investments were stalling, not from lack of technology, but from the same fractured data environment the DTI was designed to address. The difference was scale, complexity, and the specific demands of connected operations.
The DRIVE Index was built to close that gap. It expands the diagnostic framework specifically for manufacturers navigating Industry 4.0, adding the five dimensions that matter most when operational technology, IT systems, and human decision-making have to function as a single environment:
Can you trust what goes into the system in the first place?
When the pressure is on, does your team trust the numbers enough to act?
When two people pull the same number, do they get the same answer?
Can your team make a decision without stopping to reconcile first?
Does what gets decided actually change what gets done?
The Data Culture Handbook captures the full body of work behind every Great Lakes Analytics engagement: the diagnostics, the frameworks, and the operating principles that make data investments perform in real manufacturing environments.
It exists because the problem is repeatable and the solution is structured enough to document. The book isn't the service. It's evidence that the method holds up under scrutiny.
A 60-minute structured diagnostic session with a Data Trust Report and Roadmap as deliverables. The $1,500 fee applies toward any full DRIVE Index engagement.
Book Your Diagnostic Call