
Your Weekly Dose of Data Clarity: Issue 20

Monday, July 8, 2024

Dave Findlay, CEO, Fuse Data

This week's edition


Happy Monday all!


As temperatures continue to skyrocket across the country, so does the heat in the data conversations I've been watching!


In this issue, we'll be diving into:


🤖 What do Citadel and Nintendo have in common?


💰 What to consider, besides total cost, when evaluating new data tools.


๐Ÿฆนโ€โ™‚๏ธ Shot's fired! Are the data heroes at your organization really villains in disguise?


😄 Your Weekly Dose of Data Hilarity.


I hope you enjoy the issue!


Best regards,

Dave



🤖 What do Citadel and Nintendo have in common?


It's hard to think of similarities between the US hedge fund and the Japanese game and console maker, but there is one: they are both a bit cool on AI. While most companies are riding the AI hype train, executives at Citadel and Nintendo seem more measured.


This post from technology consultant Dr. Jeffery Funk highlights excerpts from a Fortune article in which Citadel founder and CEO Ken Griffin describes the current limits of LLMs and other AI models, and argues that the talent Citadel is hiring is better equipped to deal with the realities of the business world. Griffin notes that models do well when there is consistency in the underlying data; however, regimes change, and the real world is full of inconsistencies, something that human intuition is still much better equipped to handle.


Nintendo's take is similar in some ways, as they are also betting on their own talent and don't really see value in leveraging generative AI for game making. A post by Amir Satvat highlights Nintendo President Shuntaro Furukawa's thoughts on the matter:


We have decades of know-how in creating optimal gaming experiences for our customers, and while we remain flexible in responding to technological developments, we hope to continue to deliver value that is unique to us and cannot be achieved through technology alone.

These two executives are certainly pouring a bit of cold water on the AI hype, and in my opinion it's welcome. Bad (and costly) decisions get made when a technological advance is riding the hype curve, and level-headed executives need to get ahead of this to protect their teams' and companies' budgets and time investments.


All of that, however, may be coming to an end. The most recent Gartner hype cycle for AI suggests that the hype on generative AI has peaked and is descending into the trough of disillusionment. This is not a bad thing at all. It's in this trough that true value is built. This is where companies can really dig in, build ROI positive solutions, and help pull this innovation out of the trough and into the plateau of productivity.



💰 What to consider, besides total cost, when evaluating new data tools.


When evaluating new tools, most companies will only look at the sticker price when developing their business case, but there are many other factors that should be considered as well.


That's why I was happy to see a post last week from Saks VP of Data, Veronika Durgin, laying out a comprehensive list of factors to weigh depending on whether you "build" or "buy" your tools.


The only item I would add to Veronika's list is the cultural fit of the selection. This is, in some ways, covered by training and onboarding; however, it's a slightly different flavour. Admittedly, it's a bit qualitative in nature, but it can have a significant impact on the cost of getting things done.


For example, if you're a digital native enterprise with a strong engineering culture, you may not find a fit by implementing a low-code platform like Informatica or Talend as your data integration and management solution. Even after training, the technology philosophy may not be accepted by your team... and nobody wants a grumpy data engineering team. 🤣



๐Ÿฆนโ€โ™‚๏ธ Shot's fired! Are the data heroes at your organization really villains in disguise?


Last week, Nicholas Mann, CEO of Stratus Consulting, decided to take on the entire data engineering community in his post that leads off with:

Data engineers who insist on custom coding everything only care about one thing. Themselves.

The point that Nicholas is trying to make is that leaders need to be mindful of team members who may be believers in the old "job security through obscurity" philosophy.


Cowboy coding practices and the like can, on the surface, appear to solve problems, but in reality, they create technical debt and single points of failure. Nicholas's post is a very good reminder to be vigilant for this in your organization, even if he did take some heat in the comments.


In my experience, I've seen a few data villains, but most of the "cowboy coding" was done with the best intentions. It usually comes as a result of 1) underinvestment in the data program and 2) pressure from leadership to "just get the data". Put those two factors in front of a resourceful individual and there's only one outcome. Proper investment in your data program, along with skilled leadership, will go a long way toward mitigating the issues Nicholas has put forward.



😄 Your Weekly Dose of Data Hilarity


... let me Google that for you.

Brought to you by Consulting Comedy.





Want this delivered to your inbox?




