
Tuesday 4 July 2023

Investment in developing AI & ML models – timelines & diminishing returns

 

One of the questions I am most often asked by stakeholders is about the timeline required for an ML model to finish development. I will try to address the subtleties of this topic in this write-up.

AI development is a unique scenario: you are expected to deliver an innovation, and the resources required are uncertain. Hence it is sometimes very difficult to understand when and where to “STOP”.

When I talk to businesses, the point I stress the most is that they define what an “MVP” solution means to them – that is, the minimum accuracy or maximum error rate at which the AI solution would still be useful for their business.

If you are investing in AI use cases, one concept I would recommend you understand is AI resourcing and diminishing returns. Please look at the graph below –

[Graph: model error rate vs. resources/time invested, with the point of maximum return (PoMR) marked]

So, what I suggest to AI investors is: if you haven’t reached an MVP by the point of maximum return, “STOP”. For example, if at the end of the PoMR the model still has an error rate of 30%, and that does not work for your business, maybe AI cannot solve this for you. Maybe it needs a completely different approach. Whatever the case, deploying more resources is not the solution.

Drawing from my experience with the AI & ML use cases I have worked on for almost a decade now, a general rule of thumb I recommend is: the accuracy or error rate you get at the end of 3 months is your point of maximum return. You should reach an MVP by then. Beyond that it should be fine-tuning or customizing to specific business needs. If by then it is still miles away from your business objective, maybe it’s time to pull the plug.
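To make the idea of a point of maximum return concrete, here is a minimal sketch in Python. The curve shape, the numbers and the one-percentage-point cut-off are all hypothetical assumptions for illustration; in practice you would track the actual validation error you measure sprint over sprint rather than a formula.

```python
import numpy as np

# Toy model of diminishing returns: error rate decays toward a floor as more
# months of effort are invested. All numbers here are illustrative assumptions.
FLOOR = 0.25   # assumed best error rate this approach can ever reach
START = 0.60   # assumed error rate of the first rough prototype
DECAY = 1.2    # assumed speed at which the remaining gap closes per month

def error_rate(months):
    """Error rate after `months` of effort on a saturating (toy) curve."""
    return FLOOR + (START - FLOOR) * np.exp(-DECAY * months)

months = np.arange(0, 13)
errors = error_rate(months)

# Point of maximum return: the first month where one more month of resource
# buys less than one percentage point of improvement.
marginal_gain = -np.diff(errors)
pomr = int(np.argmax(marginal_gain < 0.01))

for m, e in zip(months, errors):
    print(f"month {m:2d}: error rate {e:.1%}")
print(f"Point of maximum return ~ month {pomr}, "
      f"error rate {errors[pomr]:.1%}; compare this with your MVP threshold.")
```

The exact numbers matter less than the shape: once the marginal improvement per unit of resource flattens out, more investment mostly buys fine-tuning, not a path to an MVP.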

This is again an opinion piece, and this has been my experience. I will be glad to hear how the journey has been for you.

Monday 19 June 2023

AI, ML, Data Science - frequent consultation advice to leadership

In this post I have tried to compile the questions, discussions and queries I come across most often while consulting on Data Science road maps with leaders and managers. I am hoping this compilation will add value to other leaders and managers who may at some point have wondered about them but didn’t get the opportunity to have those discussions. Many of you may have better responses or more exposure to some of the questions; I have compiled them based on the best knowledge I have and how I go about explaining them.

Please reach out if you have a question or point that comes up a lot during consultations and is worth discussing.

1. Should we build this product or capability in-house or get it from a vendor?
A vendor product will always be generalized to cut across as many businesses as possible, since vendors thrive on repeatability. When you build in-house, you can customize for a smaller set of use cases or scenarios and perhaps create better differentiation.
So please ask yourself –
· When partnering with a vendor, what role do I play? What stops the vendor from partnering with the business directly in the future? What is my value addition? Is there a risk I may become insignificant?
· What kind of team do I have? If you have a great engineering team, maybe you want to do more in-house and keep a bigger piece of the pie for yourself.
· What is my core capability? If the capability needed is in line with our core skill, or is something we want to learn, then maybe we should do it in-house; if it is something we just want to get done, then maybe the best way is to get a vendor involved.

2. We have created certain Analytics use cases. But we see several other teams also creating similar use cases.
Differentiation of analytics products or use cases is driven by each of the following, and by combinations of them –
a) Deep domain knowledge
b) Combination of data from different systems brought together on a dynamic big data system
c) Deep or mature algorithm applied
If your use cases are easy to replicate, they are most probably built on shallow data, with very general domain knowledge and basic Data Science techniques.

3. Are we using the kind of AI that is used for technologies like Self Driving car?
Yes and no. Internally, all these technologies use combinations of neural networks and reinforcement learning. For different use cases we have also used variations of the same and similar technologies. But technologies like self-driving cars work on image or vision data, which we generally don’t deal with. Our use cases are mostly based on numerical, text and language data.

4. Vendor says their product is being used by Amazon. So should we go ahead and buy it?
Maybe it is being used by Amazon or a similarly big company, but ask the vendor whether their product is being used for a mission-critical process, or only for a PoC or to store and process data such as click-stream data that is not business critical. This makes all the difference: whether the logos the vendor shows you are using the vendor’s technology for business-critical projects or for non-critical processes.

5. We are showing the use case to the business but it’s not making much of an impact.
Storytelling needs to be improved. Every analytics use case must be associated with a story that ends with the business making or saving money. If the story cannot relate how the engineering will improve the customer’s financial bottom line, the customer’s business does not care about it, irrespective of how good the engineering is.

6. Now we have a data scientist in our team. Can we now expect more insights from our data?
Data Scientists alone cannot ensure project success. Data Engineers, Big Data and Cloud Infra Engineers are equally important parts of the technical team. Without the infrastructure in place and the data stored in it in a proper format, a Data Scientist cannot do his or her magic.

7. We are finding it very difficult to hire data scientists and big data developers.
Though there is no dearth of CVs, finding genuinely talented people with real knowledge and production implementation experience is difficult. And of those few, most are already paid well and on good projects. So whenever a decision is taken to hire senior data science talent, a 6-month time frame should be kept in hand.

8. What is the difference between ML and AI?
Though you will find several answers to this on the internet, one good way I have found to explain it to a business person, without the jargon, is as follows. By definition, ML comes within the broader scope of AI. But to understand and remember it better: AI is something built to replicate human behaviour. A program is called a successful AI when it can pass a Turing test – a system passes a Turing test when we cannot differentiate the intelligence coming from the machine from that of a human. ML is a system you create to find patterns in a data set too big for a human brain to comprehend. On a lighter note – if you ask a machine 1231*1156 and it answers in a fraction of a second, it is ML; if it pauses, makes some comment and answers after 5 minutes, like a human, it is AI.

9. Why aren’t we using a big data Hadoop architecture instead of an RDBMS like MSSQL or Oracle?
RDBMS products like MSSQL and Oracle are still viable analytics products and are not replaceable by Big Data tools in many scenarios. Deciding on a data store or a data processing engine involves many factors: ACID vs. BASE properties, type and size of data, current implementation, skill set of the technical team, etc. So doing an analytics project does not make Hadoop or a NoSQL product the default.

10. Here is some data, give me some insight.
This is the first line of any failed initiative. A project that is not clear about the business problem it wants to solve is sure to fail. Starting an analytics project without a clear goal in mind, just for the sake of adding a data science project to the portfolio, with no road map for how it will eventually contribute to company goals, is a waste of resources and will only end in failure.

Sunday 18 June 2023

Competition in the context of the new world order

The definition of competition has changed, and Elon Musk realized this long back. If you analyze any competitive business now, you will somehow find one or a few of the big tech 4 (Facebook, Amazon, Microsoft, Google) involved in some way or the other. And what Elon knew was that his strongest competition would come from Google and the Ubers of the world with their self-driving technology, rather than from GM or Ford.

And we know the big tech 4’s (Facebook, Amazon, Microsoft, Google) strengths:
· Deep funds available to do experimental innovation. Significantly less pressure to go profitable.
· Army of engineers and scientists.
· Vast compute infrastructure. So vast that renting out unused infrastructure can become one of the world’s biggest businesses (read AWS).

So the big question is how their competitors stay relevant and significant. They should use what they have gathered over years of being in business, i.e. the tricks of the trade, or domain business knowledge. It can be years of building cars, manufacturing things or making software products.

So the strategy can be –
· Avoid direct competition with the tech giants. If the product is too generic in nature they will build it faster and better with their deep resources.
· Integrate data science closely with business products. Analytics should be out of the box and intuitive. If required collaborate with the tech giants’ offerings but never give away domain expertise.
· Enable data science teams with domain knowledge. Celebrate people who are domain experts and make them part of the data science team.