Why is remote software development expensive?
When we speak to prospective clients about their product development plans, one question that often comes up is why they should outsource to us at all rather than hire an in-house team. While there are many factors in comparing remote vs in-house software development, one of the most important is the overall cost. Seasoned entrepreneurs may understand this very well, but as a first-time entrepreneur, it’s easy to miss that when you hire a company for your software development, you’re paying for a finished product.
And a “finished product” is a lot more than the code that gets pushed to your servers. Here’s what goes into building successful software:
Team salaries: The salaries paid to the team working on your product are the most obvious expense, and hence this is the first item on this list. Duh!
Office space: A clean, well-maintained IT office at an accessible location with all amenities, including Internet connectivity, meeting rooms, projectors, whiteboards, a pantry, etc., is a constantly recurring cost borne by the company.
Hardware and software licenses: Employees’ laptops and networking equipment, both of which require routine upgrades as per market standards, are an important recurring cost an IT company has to bear. In addition, many software licenses, SaaS tools, and services (e.g., email services, testing servers, OS licenses) have to be paid for on a month-to-month basis.
Admin overheads: Admin and maintenance staff and the office supplies needed for day-to-day operations may seem like small expenses in isolation but quickly start to add up.
CA and accounts: Any software company needs accounting and CA (chartered accountancy) services to handle its routine compliances, taxes, and payroll.
Hiring and training costs: Hiring and training people is a stressful, expensive, and time-consuming task. Not only is the hiring process difficult, but every organization also has to deal with employee churn. Every IT company needs to spend considerable time training employees and upgrading their skills to match current trends and market expectations.
Team activities: Keeping a team happy and satisfied takes much more than market-rate salaries. Today’s workforce looks for perks, team activities, and the like, and as an employer, these are important team-building and retention strategies.
Operating profit: Over and above these costs, the software company will keep a profit margin that makes the business sustainable over the long run.
If you are starting up, it may be a better idea to simply outsource all of the above to a software company so that you have only one thing to focus all your attention on – your product.
The Rise of AI in Art: Ushering in a New Era of Machines with Creative Behaviors

Just a few years ago, AI seemed like futuristic tech straight out of a sci-fi movie. But the tables have turned. We probably experience AI-based tech more often than we think, coming across at least one instance of AI in our daily lives – be it a product recommendation algorithm on an online shopping platform or text auto-correction on our smartphones.

Recent developments in AI have begun to question the very characteristic of human nature that makes us unique: creativity. AI is already creating a myriad of visual art, poetry, music, and the like, which bear an eerie resemblance to real human art.

The Creative Leap of AI

Can we really imagine intelligent machines rivaling human creativity? Somehow, it’s still difficult to imagine a mathematical algorithm being creative, isn’t it? But not anymore. What we’re witnessing is that AI can be creative, even artistic. Recognizing and sorting images is one thing, but how about creating them from scratch?
Intelligent AI systems are now capable of creating artwork using certain algorithms. Much like the creative world, where there is no fixed set of rules to follow, AI art has no particular rules either. Thousands of images are analyzed, and then the algorithm generates a new image. Just as we refine and accessorize our paintings to make them better, AI too includes stylistic processes to generate images. More importantly, AI art does not replicate what humans do; rather, it replicates the actual human thought process and enhances human creativity – a process called “co-creativity”.
Interestingly, Creative Adversarial Networks (CANs) – a family of machine learning frameworks – maximize deviation from established styles. Human artists direct the code with a desired visual outcome in mind, and some really interesting artwork is being made. Quite artsy, isn’t it?
Collaboration is the key
Since AI has officially entered the world of art, it is easy to feel overwhelmed and to see AI as a threat to the art world. Here’s the catch – AI works best with human collaboration. Basically, AI uses the algorithms and data that are fed to it: the more you feed it, the better it gets. AI works best when there is an amalgamation of human creativity and modern machines. It’s not the technology alone that makes the difference, but rather the knowledge and creativity of humans. This means AI is not going to replace humans in art, because art requires real creativity and real human emotions.
Impact of AI algorithms in Art: Overcoming the Limitations of Human Creativity
Creativity is attainable, but human creativity has its limitations. This is where AI comes to the rescue. AI emulates and enhances our creative thought process, in art and in business as well. Art created with neural networks has recently surged, with “AI artists” on the rise. AICAN, one such algorithm, had a solo exhibition in New York, with each of its portraits selling for $6,000 to $18,000.
Another example is Unsecured Futures, a solo exhibition showcasing artwork – drawing, painting, sculpture, and video art – by Ai-Da, the first ultra-realistic humanoid robot artist. The brainchild of gallery director Aidan Meller, Ai-Da is capable of using her eyes and a pencil to draw people from life using AI-based algorithms. The exhibition questioned the human relationship with technology, yet interestingly, it was a grand success, selling around £1 million worth of Ai-Da’s artwork.
Applications of AI have found their way into the music industry too. Musicians use AI-based algorithms in their live performances and studio productions in the form of various plug-ins and software. Moreover, some current AI technologies have successfully composed entire songs.
The best example would be Bach’s music, which follows structured, almost mathematical patterns and can be easily replicated by AI. Facebook AI Research (FAIR), whose research team has created high-fidelity music with neural networks, is another such example.
The song “Drowned in the Sun”, written with the Magenta AI and launched by Google, is also a work of art. Google’s PoemPortraits is another example of how far the field of AI has come in the past few years. The new AI tool Deep Nostalgia animates the faces in old photos to make them look alive.
Recently, GPT-3 – the third generation of the language-predicting deep learning model – took the world by storm by generating some of the most human-like text: poems, stories, articles, and more.
Could AI be the Future of Art?
Humans have kept raising the bar, from drawing machines to generating art using AI. Needless to say, AI has transformed our society and changed the way we interact with technology. Though its impact on us is profound, there will always be negative consequences associated with it. But it would be hasty on our part to predict that AI will take over our lives.
When GPT-3 was announced to the world, it received a mixed response: one of utter amazement as well as deep concern. The age of the Industrial Revolution witnessed machines replacing humans as a better alternative.
Now, this raises the question – will we be replaced by AI algorithms in the same manner? Are the algorithms the better alternatives? And if so, will it be for the greater good of mankind? These are some genuine reasons for concern.
Algorithms are a product of the human thought process. AI is not as artificial as we might deem it to be. AI algorithms merely implement our thought processes on a computer. Hence, all that an AI can create is a product of collaboration among humans. We feed in the data that other humans have generated.
Art by AI algorithms is a reflection of the global creativity of mankind – an ideal representation of what we, as individuals, have put into the world.
On this World Art Day, let’s rejoice in the artistic results of a synergistic collaboration between humans and intelligent machine systems.
The COVID-19 pandemic that hit us last year brought a massive cultural shift, causing millions of people across the world to switch to remote work environments overnight and use various collaboration tools and business applications to overcome communication barriers.
This shift generates humongous amounts of data in audio format. Converting this data to text provides a massive opportunity for businesses to distill meaningful insights.
One of the essential steps for an in-depth analysis of voice data is ‘Key Concept Extraction,’ which identifies the main topics of business calls. Once this identification is done accurately, it enables many downstream applications.
One way to extract key concepts is Topic Modelling, an unsupervised machine learning technique that clusters words into topics by detecting patterns and recurring words. However, it cannot guarantee precise results, and the many transcription errors introduced when converting audio to text make the task even harder.
Let’s take a glance at some existing toolkits that can be used for topic modelling.
Some Selected Topic Modelling (TM) Toolkits
Stanford TMT: Designed to help social scientists and researchers analyze massive datasets with a significant textual component and monitor word usage.
VISTopic: A hierarchical visual analytics system for analyzing extensive text collections using hierarchical latent tree models.
MALLET: A Java-based package that includes sophisticated tools for document classification, NLP, TM, information extraction, and clustering for analyzing large amounts of unlabelled text.
FiveFilters: A free software solution that builds a list of the most relevant terms from any given text in JSON format.
Gensim: An open-source TM toolkit implemented in Python that leverages unstructured digital texts, data streams, and incremental algorithms to extract semantic topics from documents automatically.
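To make this concrete, here is a minimal topic-modelling sketch using Gensim’s LdaModel. The toy corpus, naive tokenization, and settings below are illustrative assumptions, not taken from any of the toolkits above:

```python
# Minimal topic-modelling sketch with Gensim (illustrative corpus and settings).
from gensim import corpora, models

# Toy "transcript" snippets; real input would be cleaned ASR transcripts.
docs = [
    "price increase for the enterprise plan next quarter",
    "enterprise plan pricing and discount options",
    "shipment delay caused by warehouse capacity issues",
    "warehouse capacity and shipment tracking updates",
]
texts = [d.lower().split() for d in docs]  # naive tokenization for brevity

dictionary = corpora.Dictionary(texts)           # word <-> id mapping
corpus = [dictionary.doc2bow(t) for t in texts]  # bag-of-words vectors

# Fit a 2-topic LDA model; num_topics and passes are arbitrary choices here.
lda = models.LdaModel(corpus, num_topics=2, id2word=dictionary,
                      passes=20, random_state=42)

for topic_id, words in lda.print_topics(num_words=4):
    print(topic_id, words)
```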
Our AI CoE team has developed a custom solution for key concept extraction that addresses the challenges we discussed above. The whole pipeline can be broken down into four stages, following a “high recall to high precision” system design that uses a combination of rules and state-of-the-art language models like BERT.
Pipeline:
1) Phrase extraction: The pipeline starts with basic text pre-processing – eliminating redundancies, lowercasing texts, and so on. Next, specific rules are used to extract meaningful phrases from the texts.
2) Noise removal: This stage of the pipeline filters the extracted phrases, removing noisy ones based on the signals mentioned below:
Named Entity Recognition (NER): Certain entity types, such as quantity, time, and location, that are most likely to be noise for the given task are dropped from the set of phrases.
Stop-words: A dynamically generated list of stop words and phrases obtained from casual talk removal [refer to the first blog of the series for details on the casual talk removal (CTR) module] is used to identify noisy phrases.
IDF: IDF values of phrases are used to remove common recurring phrases that are part of the usual greetings in an audio call.
3) Phrase normalization: After removing the noise, the pipeline combines semantically and syntactically similar phrases. To learn phrase embeddings, the module uses the state-of-the-art BERT language model and domain-trained word embeddings. For example, “Price Efficiency Across Enterprise” and “Business-Venture Cost Optimization” will be clubbed together, as they essentially mean the same thing (a rough sketch of this step follows the pipeline).
4) Phrase ranking: This is the final stage of the pipeline, which ranks the remaining phrases using various metadata such as frequency, number of similar phrases, and linguistic POS patterns. These metadata signals are not comprehensive, and other signals may be added based on any additional data present.
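As an illustration of stage 3, here is a rough sketch of grouping near-duplicate phrases by embedding similarity. The post describes BERT plus domain-trained word embeddings; this sketch substitutes the off-the-shelf sentence-transformers library and a simple cosine-similarity threshold, both assumptions made for brevity:

```python
# Rough sketch of phrase normalization: group phrases whose embeddings are similar.
# sentence-transformers is used here as a stand-in for the custom BERT setup.
from sentence_transformers import SentenceTransformer
from sklearn.metrics.pairwise import cosine_similarity

phrases = [
    "price efficiency across enterprise",
    "business-venture cost optimization",
    "shipment tracking updates",
]

model = SentenceTransformer("all-MiniLM-L6-v2")  # illustrative model choice
emb = model.encode(phrases)

sim = cosine_similarity(emb)
THRESHOLD = 0.6  # arbitrary; would be tuned on domain data

# Greedy clustering: each phrase joins the first existing cluster it is close to.
clusters = []
for i, p in enumerate(phrases):
    for c in clusters:
        if sim[i][c[0]] >= THRESHOLD:
            c.append(i)
            break
    else:
        clusters.append([i])

for c in clusters:
    print([phrases[i] for i in c])
```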
Natural Language Intent Recognition: Intelligent Audio Transcript Analytics Using Semantic Analysis to Understand User’s Intent
In the modern business landscape, timing is everything. Quickly identifying a user’s intent can help you get a leg up on your competition. How? It enables you to respond proactively to a potential customer’s interest and multiply your chances of influencing the key decision-makers through meaningful conversations.
But, if you receive thousands of customer interactions a day, detecting customer intent in your unstructured data is challenging. The good news is that you can automate intent classification with artificial intelligence, so you can identify intent in thousands of emails, social media posts, and more in real-time and prioritize responses to potential customers.
Raise your hand if you’re a business that’s finding it increasingly complex to detect user intent from voluminous unstructured data sets full of long-winded sentences and multiple juxtaposed objectives. Chances are your hand is up.
The good news is that you now have a solution to this.
What is Intent?
Simply put, intent refers to anything a user wants to accomplish.
From a technical perspective, we define an intent as a single sentence or a group of 2-3 contiguous sentences that can convey an idea on its own, with the necessary context. Extracting the call intent can lead to many downstream applications, such as better content creation and planning.
3 Challenges in Natural Language Intent Recognition
We discussed some challenges in Part 1 of this 4-blog series. Here we discuss three more challenges specific to intent recognition (or intent classification):
There can be multiple intents present across the call transcript, as discussed in the example above.
Differentiating between the client’s intent and its details. In the above example, this means separating the intent itself, i.e., wanting to know about the growth percentage, from its supporting details.
Missing or incorrect punctuation leading to the wrong sentences being extracted as questions. For example, “I’m not sure what the report says?” is punctuated as a question even though it is not one.
Cleaning and casual talk removal steps, mentioned in Part 1 of this 4-blog series, are followed to remove the unwanted sentences present in the transcripts. This important step strongly affects the output of the subsequent steps.
For instance, “How are you Cathy? How was your vacation?” should not be extracted by the Question Analytics module, which we will explore later in this blog. The intents are present throughout the call; however, we observed that 92% of the time, the intent was in the first half of the call. Hence, we focused on it to increase the precision of the system.
Feature extraction
Natural Language Question Extraction: To extract the questions that clients ask, we use the Anteelo Question Analytics NLP Accelerator, which follows a hybrid approach (a combination of rule-based and supervised methods). The rule-based approach leverages 5W-1H words and four generalized POS-tag and dependency-parser patterns to detect the starting point of the interrogative part, if any, in a sentence. The supervised classifier was trained on ~100k questions (a simplified sketch follows this list).
Constraints-based Intent Sentence Extraction: This step identifies the objectives of the client that are not conveyed in the form of questions. Intent identification is done by skip-gram matching of two generalized dependency-parser patterns.
Contiguity Sentence Extraction: We also extract important sentences after the first extraction step to provide more context. Subsequent sentences are extracted if they are tightly coupled with the preceding sentences, as identified by conjunctions and other markers.
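To give a flavour of the rule-based part of question extraction, here is a simplified sketch using spaCy. The 5W-1H and auxiliary-verb heuristics below are rough stand-ins for the accelerator’s four generalized patterns and supervised classifier, which are not public:

```python
# Simplified sketch of rule-based question detection with spaCy.
# Heuristics: a 5W-1H word anywhere, or an auxiliary/modal verb in first position.
import spacy

nlp = spacy.load("en_core_web_sm")  # requires: python -m spacy download en_core_web_sm

WH_TAGS = {"WDT", "WP", "WP$", "WRB"}  # which/what/whose/when-how POS tags
AUX_STARTERS = {"do", "does", "did", "is", "are", "was", "were", "can",
                "could", "will", "would", "should", "have", "has"}

def looks_like_question(sentence: str) -> bool:
    doc = nlp(sentence)
    tokens = [t for t in doc if not t.is_punct]
    if not tokens:
        return False
    has_wh = any(t.tag_ in WH_TAGS for t in tokens)
    aux_first = tokens[0].lower_ in AUX_STARTERS
    return has_wh or aux_first

print(looks_like_question("What is the growth percentage this quarter"))  # True
print(looks_like_question("I'm not sure what the report says"))           # True (false positive)
```

Note how the second example reproduces exactly the punctuation/wh-word false positive described earlier, which is why the rule-based layer is paired with a supervised classifier.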
Intent Segments Formation
Primarily, the intent segments are formed by combining the contiguous sentences extracted by the methods stated above. However, simply combining contiguous sentences can lead to segments with too many sentences, which would decrease the system’s effectiveness.
The system divides each obtained segment into subsets with the least deviation in the number of sentences per subset and with a maximum of six sentences in a subset. The splitting is done considering the continuity and similarity of sentences. These split segments are treated as the final intent segments that are fed into the next module.
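A minimal sketch of this splitting rule, under the assumption that “least deviation” means dividing the segment into the fewest parts of at most six sentences with sizes as even as possible:

```python
# Split a segment into the fewest sub-segments of at most 6 sentences,
# keeping sub-segment sizes as even as possible (an assumed reading of
# "least deviation"; the production rule also considers sentence similarity).
import math

MAX_SENTENCES = 6

def split_segment(sentences):
    n = len(sentences)
    if n <= MAX_SENTENCES:
        return [sentences]
    parts = math.ceil(n / MAX_SENTENCES)  # fewest parts that respect the cap
    base, extra = divmod(n, parts)        # sizes differ by at most one
    out, start = [], 0
    for i in range(parts):
        size = base + (1 if i < extra else 0)
        out.append(sentences[start:start + size])
        start += size
    return out

print([len(s) for s in split_segment(list(range(14)))])  # [5, 5, 4]
```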
Natural Language Intent Ranking
This module ranks the intent segments obtained from the above module. We use multiple signals to rank these segments; a toy scoring sketch follows the list.
Topics: Topics obtained from Key Concept Extraction (see Part 2 of this blog series) boost the scores of segments containing those concepts.
Number of questions: Segments containing a high number of questions receive a higher score.
Importance of paragraph: Segments within bigger paragraphs, which tend to carry vital information, are given higher weightage.
Summarization: Segments containing TextRank + BERT summary sentences get a score boost.
More signals can be added to meet domain-specific needs.
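To illustrate, a toy scoring function that linearly combines such signals might look like the following; the signal names and weights are assumptions, not the production formula:

```python
# Toy ranking sketch: combine per-segment signals into one score.
# Signal names and weights are illustrative, not the production formula.
WEIGHTS = {
    "topic_overlap": 2.0,    # key concepts from Part 2 present in the segment
    "num_questions": 1.5,    # questions detected in the segment
    "paragraph_size": 0.5,   # size of the enclosing paragraph
    "summary_hits": 1.0,     # TextRank + BERT summary sentences in the segment
}

def score_segment(signals: dict) -> float:
    return sum(WEIGHTS[name] * signals.get(name, 0.0) for name in WEIGHTS)

segments = [
    {"id": "seg-1", "topic_overlap": 3, "num_questions": 2,
     "paragraph_size": 4, "summary_hits": 1},
    {"id": "seg-2", "topic_overlap": 1, "num_questions": 0,
     "paragraph_size": 2, "summary_hits": 0},
]
ranked = sorted(segments, key=score_segment, reverse=True)
print([s["id"] for s in ranked])  # ['seg-1', 'seg-2']
```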
Dynamic number of Output Intents
Since there is no fixed number of intents that clients raise in a call, a hard cut-off of the top “N” intents will not produce desirable output. Hence, the system is designed to automatically output a dynamic number of intents for each transcript, using a differential cut-off to decide how many intents to return.
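One plausible reading of a differential cut-off is to sort the segment scores in descending order and cut at the steepest drop between consecutive scores. A sketch under that assumption:

```python
# Differential cut-off sketch: keep intents up to the steepest drop in score.
# This is one plausible reading of "differential cut-off", not the exact rule.
def dynamic_cutoff(scores):
    s = sorted(scores, reverse=True)
    if len(s) < 2:
        return len(s)
    drops = [s[i] - s[i + 1] for i in range(len(s) - 1)]
    return drops.index(max(drops)) + 1  # cut just before the steepest drop

print(dynamic_cutoff([9.1, 8.7, 8.5, 4.2, 3.9, 1.0]))  # 3 -> top three intents kept
```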
The radical transformation in how people across the world are living during the Coronavirus pandemic is having a significant impact on internet businesses. While some are seeing sales plummet, others are struggling to cope with growing demand. In this post, we’ll look at how the online marketplace is changing in the current circumstances.
1. Growing demand for streaming services
Millions of people are turning to movies and box-set series to keep them entertained while they are cooped up indoors. As a result, streaming services are seeing growth not just in the amount of time people are watching but in the number of new customers flocking to their services. In Europe, Netflix has had to reduce its picture quality by 25% to ease the strain on bandwidth.
Increased demand means that in North America, Netflix is now forecast to more than double expected growth in new subscriptions, from 1.6% to 3.8% over the year – and that’s in a region where it is already well established. Internationally, growth is expected to rise by over 30%.
It’s not just Netflix that is benefitting; so too are other streaming services, like Amazon Prime Video, Hulu and Now TV. Recently launched services like Disney+ and the BBC-ITV venture, BritBox, which may have struggled to compete, might find opportunities that wouldn’t have arisen in normal circumstances.
2. Online gaming taking off
Although a narrower market, younger people forced to stay at home are driving up demand for gaming. This isn’t just increasing subscriptions for online gaming services but also helping retailers of downloadable PC games. The PC gaming platform Steam, for example, has seen its highest user numbers in its 16-year history, with traffic spikes of over 20 million concurrent users at times.
3. Big impact on PPC ad spending
The travel industry has been one of the sectors most affected by the virus, and this has resulted in a slump in advertising from travel-related businesses, with some market experts suggesting it could lead to a 15–20% reduction in travel advertising revenue for Google and Facebook. This figure is likely to be compounded by all the other businesses that rely on tourism also cutting their ad spend.
It is not just travel-related businesses that are reducing advertising. With many companies forced to close due to the effects of social distancing, they too will be cutting back or suspending advertising altogether. In 2018, McDonald’s spent over a billion dollars on advertising in the US alone. It has now closed all its UK stores and is shutting thousands of others globally as the pandemic spreads. It obviously won’t be damaging its cashflow by spending huge amounts on ads over this period. With industries such as entertainment, high street retail, restaurants, etc., also affected, Google and Facebook could see ad revenue fall by up to 45% over the next few quarters.
However, it is not all bad news. With fewer advertisers competing for ads, the cost per click in many sectors is likely to reduce, meaning those companies that can still derive value from advertising will see their budgets go further. In addition, consumers are clicking on more ads associated with employment, education, hobbies, leisure, arts and entertainment.
4. Holiday bookings won’t dry up
While travel is out of the question for most people at the moment, more than half of those who take frequent holidays are likely to book trips further into the future. Business travellers are even more likely to make long-term bookings. While this is not the immediate relief that the travel industry and all its dependent industries need, the taking of deposits can help with current cashflow problems. Most of these bookings will take place over the internet.
As the pandemic begins to recede, it is predicted that most holidaymakers will initially seek domestic holidays, where there is likely to be less disruption from failing tour operators and airlines and where the impact of the virus is better known than abroad.
5. Global increase in online shopping
As fewer people go out, their shopping habits are moving online. Even retailers seeing a boom in sales, like supermarkets, are having more customers using their delivery service simply to avoid the risk of going to the store.
This rise is happening globally. An Ipsos MORI study found that 18% of UK consumers were shopping more online. In countries that have been more badly affected, the numbers of people increasing their internet shopping are even more substantial: 31% in Italy and 51% in China. However, the biggest increases are in countries like India (55%) and Vietnam (57%). This rise has meant some companies are struggling to cope with demand. Amazon, for example, is so busy it is recruiting 100,000 additional staff, raising wages and asking its employees to work overtime to meet demand.
One area of particular growth is in the use of grocery apps, which are seeing unprecedented numbers of downloads in the US. Instacart downloads during March are already more than triple that of February while Walmart’s app has seen a 160% rise.
Conclusion
Coronavirus is having a significant impact on consumer behaviour, and this is affecting internet businesses in different ways. For many, there are challenging times ahead as consumers drop travel plans and stop making online bookings with local businesses. However, there has been a sharp increase in online shopping, with some retailers having to expand their workforces to cope.
Before singing the praises of MLOps, let me shine some light on a few new lessons that companies across the globe, especially in CPG, have learned thanks to the post-pandemic crisis.
Digital channels – or at least digitization – are a requisite. As Yoda said: do or do not, there is no try! CPG companies that toiled for years to see their brands sprout across the market witnessed a sharp decline in sales in a matter of months. Logistics became a big problem, yes, but their poorly implemented strategies were the actual Gordian Knot.
Today, consumers have a plethora of options. CPG firms cannot rely on their standard go-to-market strategies. How to connect with end-consumers? Now, there is an addendum to the question – how to connect with end-consumers and win them?
Companies across the world, irrespective of size and market presence, have started moving from offline to online in one way or another – whoever does not think and act ‘online’ is set up for a loss.
Health and wellness have become essential factors for the customers.
Millennials shop online; nothing drives them more than cost-to-value. They want convenience and a sense of belonging, and all of it at lower prices.
Well, these are just the skeleton of the picture; the actual painting factors in multiple new developments, such as:
The emergence of small and medium-sized companies, focusing on target customers.
Manufacturers and distributors sharing data to streamline logistics.
A surge in the usage of automated systems.
Shift towards local consumption.
E-Logistics companies collaborating with the retail stores.
The list is long.
A quick glimpse of how a product reaches the end consumer.
If you start eagle eyeing each step, you will find tremendous opportunities hidden in them.
Here are a few.
Opportunity 1 – Introduce a forecasting functionality based on new data.
Opportunity 2 – Bring in an integrated system that synchronizes data across the process.
Opportunity 3 – Factor in a self-learning feature that incorporates market changes, customers’ buying behavior, etc.
You can cash in on the above opportunities by implementing automation systems with various machine learning (ML) algorithms. For example, you can introduce:
Route optimization to make the best of the sales reps’ time.
Product optimization to solve the product mix problems.
NLP to analyze the consumers’ behavior.
Trade promotion optimization to plan and execute your trade spends.
Again, this list is endless.
So, you have the solution – build ML models and deploy them. But what are the critical roadblocks in adopting machine learning?
Problem 1 – Continuous delivery of value
The team that works on the use case and writes the ML code does not deploy it – or at least, they do not have expertise in delivery. So pinning your success entirely on the data science team can frustrate them and derail your ML journey.
Problem 2 – Composite and complex ML builds
Unlike traditional software builds, ML models make predictions by (indirectly) capturing data patterns rather than following explicit rules. The ML build runs a pipeline that extracts patterns from the data to create model artifacts, making it far more complex and experimental.
Problem 3 – Productionizing ML models
Gartner estimates that 80% of data science projects fail or never make it to production. To run a project successfully in a real-time environment, you need to detect problem situations and solve them as they occur. You need to continuously monitor the process to find the difference between correct and incorrect predictions (bias) and know in advance how well your training data will represent real-time data.
Areas to Focus: Identify Where Things Might Go Wrong for You
Beyond ML deployment difficulties and risks in CPG, there are several other key areas where things can go wrong, so:
Find out the exact use case; if you try solving the wrong problems, things will go wrong.
Do not build models that do not map well to your business processes.
Check if you have any flawed assumptions about the data.
Convert the results of your experimentation into a production-ready model.
There are opportunities, there are problems, and there are ML models. However, what most often delays model deployment or triggers performance issues is simply the lack of means to deploy models successfully. Anteelo can reduce your effort in solving ML deployment challenges through its state-of-the-art ML Works platform, which provides the means to run thousands of ML models at scale, all at once.
The global market in 2021 is faster, more digital, and more competitive than ever. Customers carry the baton, and demand signals keep flowing into the enterprise from ever more channels.
The modern supply chain dynamics require innovative capabilities and strategies to deal with uncertainties, improve resilience and implement holistic solutions to balance costs, services, deliveries, and customer expectations.
Four Challenges Facing the Modern Supply Chain Industry
Supply chain leaders need to manage a highly complex supply chain for the global business environment and deal with disruptions to keep the bottom line and top line intact. However, for decades, poor supply chain visibility has suffocated the industry.
Here are the four challenges gripping the modern supply chain.
1. Data and Application Silos
Vertical organizations often fly blind.
Yes, that is true. Most companies are vertically integrated and use systems such as ERP, TMS, WMS, and MRP to manage their functional departments. Each function primarily relies on plans developed within such systems to drive execution, monitoring, and control. As a result, critical information such as customer demand, logistics, and function-specific supply challenges and backlogs is siloed and invisible to other departments.
While function-specific analysis is time-consuming, cross-functional insights are even more challenging and require sifting through large volumes of data. Thus, business unit heads lose sight of the strategic ambitions of the overall supply chain.
According to a survey by Supply Chain Dive, only 6% of companies believe that they have achieved complete supply chain visibility.
The lack of supply chain visibility is overwhelming, and the industry keeps staggering under it.
2. Lack of Know-How, Tools, and Technologies to Generate Insights
With the advent of digital data, data volume, accessibility, and insight generation through analytics are critical to creating a sustainable supply chain.
However, because analytics is not widely adopted, the data is poorly used.
The data engineering and analytics capabilities in most supply chains are insufficient. As a result, supply chain leaders often cannot effectively use relevant data at the required speed. They also lack diagnostic and advanced analytics tools/technologies and often fail to understand the nature of use cases or problems in the supply chain.
3. Lack of Predictive and Prescriptive Capabilities
Digitalization is not enough.
As per the Chartered Institute of Procurement & Supply Risk Index’s report, the average annual economic loss caused by major natural disasters around the world is approximately US$211 billion.
Supply chain leaders also need to leverage new capabilities to predict market moods, deviations, and an unanticipated geopolitical landscape.
However, most existing advanced analytics applications cater to solving point problems. There is an acute shortage of capabilities to use prescriptive analytics, simulations, or what-if analysis to investigate broader issues in the supply chain and make recommendations. In addition, there are only a few good AI/ML-driven analytics solutions out there, which keeps executives from using machine learning and limits the automation of the supply chain.
4. Lack of Off-the-shelf Solutions
Every use case or problem varies from customer to customer, so off-the-shelf products cannot meet customization and personalization requirements. The KPIs that businesses want to measure also vary from company to company, making them impossible for off-the-shelf applications to handle. Such rigid solutions put the burden on supply chain leaders to get data into the desired format.
Indulgent customizations and choice complexities often lead to value destruction.
The Need for End-to-end Supply Chain Visibility Capabilities through a Digital Control Tower
“Gartner reports that 79% of supply chain leaders believe the internet/platform-based approach is the most critical new business model.”
Addressing the above four challenges requires building a digital control tower with data engineering functions and pipelines on top of a solid data layer. Establishing a simplified data architecture with an automated framework can integrate master data and transactional data sources in a streamlined manner, ensuring the availability of necessary data across multiple silos to obtain an accurate, real-time view of overall supply chain health.
AI/ML-driven analytics and rapid scenario planning can provide speed, consistency, and flexibility to achieve controllable and manageable supply chain functions, thereby helping executives gain a competitive advantage.
Two Critical Elements for an Ideal Supply Chain Control Tower
An ideal Supply Chain Control Tower (SCCT) is a cross-departmental, system-integrated “information hub” that provides end-to-end visibility.
There are two key elements to build/implement an ideal SCCT.
1. Real-time Visualization Catering to Different Personas
Executive Insights: An ideal supply chain control tower will provide a bird’s eye view of the overall supply chain health. It will enable the leaders to collect and distribute information, identify risks, and respond strategically.
Execution Insights: SCCT’s state-of-the-art setup caters to the nuanced aspects of supply chain health for multiple execution personas – analysts or managers at the DC or fulfillment-center level – letting them view the various KPIs. It provides them with the information to monitor, measure, and manage different aspects of the supply chain, including transportation, inventory movement, and operational activities.
2. Use Case Approach for Autonomous Supply Chain
The ideal supply chain control tower can guide leaders/managers to explore potential use cases. It will allow them to find the most critical challenges that profoundly impact the overall performance of the supply chain and use advanced analytics, such as machine learning, advanced forecasting, or advanced scenario planning. In this way, they can combine use cases with visualization and diagnostic capabilities and automate the supply chain as they mature.
Conclusion: Control Towers are Stepping Stones Towards Autonomous Supply Chain
The supply chain control tower provides complete visibility from high-level monitoring layers to execution details, so the executives can optimize, manage, plan and execute supply chain processes and operations faster and more accurately. The addition of anomaly detection, automated root cause analysis, and response capabilities will further simplify the transition towards a cognitive supply chain control tower.
As businesses gather and store ever greater quantities of data, managing it becomes increasingly challenging. To get the maximum value from it, it needs to be easily accessed and compiled so that it can be analysed. However, when it is stored in separate silos across numerous departments, this is hard to achieve. The solution that many companies are opting for in order to overcome these issues is data warehousing. In this post, we’ll look at the pros and cons of setting up a data warehouse.
What is a data warehouse?
A data warehouse is a centralised storage space used by companies to securely house all their data. As such, it becomes a core resource from which the company can easily find and analyse the datasets it needs to generate timely reports and gain the meaningful insights needed to make important business decisions.
The pros of data warehousing
The growing popularity of data warehousing is down to the benefits it provides businesses. Key here is that a unified data storage solution enhances decision making, enabling businesses to perform better in the marketplace and thus improve their bottom line. As a data warehouse also means data can be analysed faster, another advantage is that it puts the company in a better position to react to opportunities and threats that come its way.
With the entire array of the company’s data available to them, data managers can make more accurate market forecasts and do so quicker, helping them implement data-driven strategies swiftly and before their competitors. The accuracy of market forecasts is improved due to the warehouse’s ability to store huge amounts of historical data that can highlight patterns in market trends and shifting consumer behaviours over time.
Data warehousing can also help companies reduce expenditure by enabling them to make more cost-effective decisions, whether that’s in procurement, operations, logistics, communications or marketing. It can also massively improve the customer experience, with end-to-end customer journey mapping helping the company personalise product recommendations, issue timely and relevant communications, deliver better quality customer service and much more.
The cons of data warehousing
While the centralised storage of data brings many benefits, it does have some drawbacks that companies need to consider. For example, with such vast amounts of data in one place, finding and compiling the datasets needed for analyses can take time – though not as long as would be needed if they were all kept in separate silos.
Another potential issue is that when data is stored centrally, all the company’s data queries have to go through the warehouse. If the company’s system lacks the resources to deal with so many queries, this can slow down the speed at which data is processed. However, using a scalable cloud solution for data warehousing, where additional resources, charged on a pay per use basis, can be added as and when needed, eradicates this issue.
For many companies, the biggest obstacle for setting up a data warehouse is the cost. When undertaken in-house, there is often significant capital expenditure required for the purchase of hardware and software, together with the overheads of running the infrastructure. Additionally, there are ongoing staffing costs for experienced IT professionals. Again, the solution comes in the form of managed cloud services, like Infrastructure as a Service (IaaS), where the hardware and operating systems are provided without the need for capital expenditure and where software licensing can be significantly less expensive. What’s more, the service provider manages the infrastructure on your behalf, reducing staffing requirements. Even where specialised IT knowledge is required in-house, such as with integrating different systems, the 24/7 technical support from your provider will be there to offer expertise when needed.
Conclusion
Any company undergoing the process of digital transformation needs to consider the benefits of data warehousing. The centralised storage of all the company’s data is essential for companies that wish to integrate their existing business processes with today’s advanced digital technologies. Doing this means you can fully benefit from big data analytics, artificial intelligence and machine learning, and all the crucial insights they offer to drive the company forward.
Setting up a data warehouse in-house, however, presents several major challenges. There is significant capital expenditure required at the outset, together with on-going overheads. In addition, integrating a diverse set of company systems so that data can be centralised is not without its technical challenges. By opting for a cloud solution, however, cap-ex is removed, costs are lowered and many of the technical challenges are managed on your behalf.
Data Drift, one of the most common indicators monitored in MLOps, tracks changes in the data distribution. It is a metric that measures the change in distribution between two data sets. Before diving deeper, let us examine how ML Works defines drift for a time series use case and how the different drift components provide valuable insights and recommendations.
In Illustration 1 below, we can see that distributions of the light blue and dark blue samples (training and test data sets, respectively) are different for the same bin definitions of a feature in the model. This difference in the distribution is what drift quantifies as a percentage of shift.
Illustration 1: Distributions of the training (light blue bars) and the test data (dark blue bars).
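One common way to quantify such a shift between two binned distributions is the Population Stability Index (PSI). The exact formula ML Works uses is not specified here, so treat the following as an illustrative sketch:

```python
# Illustrative drift computation via Population Stability Index (PSI) over
# shared bin definitions. ML Works' exact formula is not specified in the text.
import numpy as np

def psi(train, test, bins=10, eps=1e-6):
    # Bin edges are defined on the training data and reused for the test data,
    # matching the "same bin definitions" setup in Illustration 1.
    edges = np.histogram_bin_edges(train, bins=bins)
    p, _ = np.histogram(train, bins=edges)
    q, _ = np.histogram(test, bins=edges)
    p = p / p.sum() + eps
    q = q / q.sum() + eps
    return float(np.sum((p - q) * np.log(p / q)))

rng = np.random.default_rng(0)
train = rng.normal(100, 10, 5000)  # e.g., historical Promotion Spends
test = rng.normal(110, 12, 5000)   # shifted new stream -> non-trivial PSI
print(f"PSI = {psi(train, test):.3f}")  # rule of thumb: > 0.2 flags major drift
```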
Data Drift in Time Series Models
Let’s consider a Promotion Effectiveness Model as an example with four variables:
Total Promotion Spends
Promotion Duration
Product’s Base Price
Product’s Promoted Price
These variables drive product sales every month, and data drift is measured for the three major aspects of a time series model, i.e.,
Feature Drift
Target Drift
Lag Drift
Feature Drift
In Feature Drift, each variable in the training data is compared with the new stream of data that the model uses to make predictions. Combining each feature’s importance with its Feature Drift (e.g., Promotion Duration) can give an idea of the data problems you need to address as part of diagnosing model degradation.
Note: Feature-level insights are applicable to all types of machine learning models.
Target Drift
Target Drift plays an important role in further understanding data issues. It measures how the predictions on the new data stream differ in distribution from the trained model’s target variable. Target Drift therefore indicates how extreme the model’s predictions are compared to the training data.
Note: If Target Drift exists despite there being no Feature-level Drift, one can assert that the model is under-fitted and the relationship between the Features (X) and the Target (Y) is not robust enough for making predictions, or that the model is over-fitted to outliers, etc. (These reasons are not exhaustive.)
Therefore, it is recommended to investigate the model training process and increase the quantity/quality of data entering the model (improve correlation, feature transformations, better stratification, etc.).
Lag Drift
In time series models, auto-correlation is likely to affect the final prediction of the model. Hence, to identify data pattern changes in the lag components, Lag Drift (a direct comparison of the training and test data frames of the model’s lag components) was introduced.
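Reusing the psi() sketch from above, Lag Drift could be approximated by comparing the distributions of the model’s lag features; this is an assumed reading of the “direct comparison” described here, not ML Works’ exact computation:

```python
# Lag Drift sketch: run the same PSI comparison on lag-1 versions of the
# training and scoring series, so changes in the lag feature columns surface
# as drift. In practice this runs on the model's lag feature data frames;
# this toy version just shows the mechanics. Reuses psi() from the sketch above.
import pandas as pd

def lag_drift(train_series, test_series, lag=1):
    train_lag = pd.Series(train_series).shift(lag).dropna().to_numpy()
    test_lag = pd.Series(test_series).shift(lag).dropna().to_numpy()
    return psi(train_lag, test_lag)  # psi() as defined in the drift sketch above
```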
Note: If there is no Feature Drift or Target Drift, but there is Lag Drift, retraining the model with a better data sample is recommended for accurate sales prediction.
Some of the metrics elucidated above can help you set up the capability to monitor the health and degradation of production models and determine the data handling/modeling changes required to implement and sustain ML solutions and automation.
Illustration 2: Functional Flow of the First Step of Automating the ML Solution.
Based on our many years of consulting experience, we have built an enterprise-grade MLOps product called ML Works to address the problems mentioned above and enable ML solutions to take the first step in the MLOps journey.
With the rise of more and more MLOps platforms, the business world is moving towards an inevitable transformation. Today, big players like Google, Microsoft, and Amazon have begun to monetize this space.
As Anteelo’s next-gen industrialized MLOps product, ML Works can reduce your Data Scientists’ efforts and lead your organization towards faster and more frugal innovation.