NextFutureToday - Episode 29: Jerk, Decision Loops, and the Realities of Enterprise Automation

Hi Next Future Today listeners,

This week’s episode is packed with big ideas and practical advice. Dan Keldsen (your host and Infocap.ai’s Chief Innovation Officer) sits down with Chris Surdak, Chief Transformation Officer at IRPA (the Institute for Robotic Process Automation & AI). If you’re curious about how to make your organization more agile, resilient, and successful in the age of automation and GenAI, this is a “can’t-miss” conversation.

What’s Inside:

Navigating the Hype & Reality of Automation

Dan and Chris both have long, winding, and unconventional careers in tech and automation. They swap stories from their Lockheed Martin and NASA days, discussing how paperwork, signatures, and bureaucracy used to slow everything down—until the dawn of Lotus Notes and the first waves of process automation.

Value of Information: Do You Really “Get It”?

Remember RadioShack? Turns out, the most valuable asset in its bankruptcy wasn't the storefronts or the inventory; it was decades of customer data. Chris and Dan dig deep into infonomics—why information is the new currency, and why most companies still don't treat it that way.

Chaos Theory, Fractals & Why Tech Projects Fail

Why do so many IT (and automation) projects crash and burn? Chris shares war stories about risk, resilience, and “unknown unknowns”—plus the power of planning for failure rather than just hoping for success. If you’ve ever been in a post-mortem meeting scratching your head, you’ll relate!

Is It All Just Hype? How to Spot Real Value in AI

It’s tempting to jump on the generative AI bandwagon. But should you? Chris’ advice: seek out the naysayers, listen to dissenting voices, and beware of “shut up, we’re making progress!” groupthink. If you haven’t started with AI and automation yet, maybe waiting and learning is the wisest move.

Coming Soon: AI Governance Playbook

Chris teases his upcoming book, which lays out a comprehensive (and refreshingly honest) framework for AI governance: “AI is effectively ungovernable, but that doesn’t mean we shouldn’t try.” Guardrails, anyone?


Key Takeaways:

  • You can’t anticipate every failure, but you can build resilience.

  • The value of information isn’t just theoretical—it’ll decide which companies thrive or disappear.

  • Don’t get swept up by hype cycles. Look for patterns, listen for outliers, and remember: innovation thrives where you expect it least.

For More: Listen to the full discussion and check out IRPA’s global automation & AI community at irpa.ai.

Want to go deeper?

  • Look out for Chris’ upcoming AI Governance Playbook—and revisit his past work, including his book “Jerk.”
  • And grab a copy of Dan’s book “The Gen Z Effect” - even 11 years later, the future we were predicting has come true and then some!

Stay curious, stay grounded, and as always—let’s build the future, one pragmatic step at a time.

Questions? Comments?
Comment below or hit us up on social media!

Until next time,
The Next Future Today Team at infocap.ai - the Human-Centric Automation Company

P.S. Put your future into action today—not someday!
