Welcome to insideAI News’s “Heard on the Street” round-up column! In this regular feature, we highlight thought-leadership commentaries from members of the big data ecosystem. Each edition covers the trends of the day with compelling perspectives that can provide important insights to give you a competitive advantage in the marketplace. We invite submissions with a focus on our favored technology topic areas: big data, data science, machine learning, AI, and deep learning. Enjoy!
AI for Good – Computer Vision in health, wildfire prevention, and conservation. Prashant Natarajan, VP of Strategy and Products at H2O.ai
Deep learning and advances in interpretability and explainability are bringing computer vision to life. These advances, coupled with multi-cloud compute and storage, are allowing business and technical teams to create value more rapidly than before. The resulting CV models, scorers, and end-to-end AI pipelines deliver higher accuracy, faster experimentation, and quicker paths to outcomes and business value. The development of pre-trained computer vision models is another consequence of these innovations, and augurs well for the ability to predict, scale, and deploy solutions for wildfire prevention and management; wildlife conservation; medical imaging; and disease management. For example, data scientists are able to use satellite imaging, along with a mix of historical and current climatic data and human habitation trends, to predict and manage wildfires. These solutions – built on a foundation of multi-modal AI and computer vision – are being used by responders, local leaders, businesses, and communities to help save lives, prevent damage to property, and reduce the impact on natural resources. Computer vision is also aiding wildlife conservation. The vast variety and volume of selfies and tourist photos – in addition to images from wildlife cameras and tracking reports – are being put to work with computer vision. AI is used to identify individuals, groups, and the migration patterns and habits of animals in danger of extinction. Scientists and local communities are deploying these solutions today to better protect habitats and keep watch on threats to their ecosystems. The most impactful uses of computer vision are in health and wellness. Given the challenges with data and the acute pressures faced by clinicians, computer vision on medical images is needed now more than it was before COVID. Health systems, pharma companies, and public sector health programs are looking for new outcomes and value from AI – in areas as diverse as abnormality detection, disease progression prediction, and clinician-supporting tools in precision medicine and population health.
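To make the pre-trained-model pattern concrete, here is a minimal sketch of how an ImageNet-trained backbone could be repurposed to score satellite tiles for wildfire risk. It assumes PyTorch/torchvision; the binary fire-risk head and labels are hypothetical illustrations, not H2O.ai’s actual pipeline.

```python
# Illustrative sketch: adapting a pre-trained vision backbone to score
# satellite tiles for wildfire risk. The replaced head is untrained here;
# a real pipeline would fine-tune it on labeled tiles and add climate
# and habitation features.
import torch
import torch.nn as nn
from torchvision import models, transforms

# Start from an ImageNet-pre-trained backbone and swap in a binary head.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, 2)  # [no_risk, fire_risk]
model.eval()

preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

def score_tile(tile_image):
    """Return a fire-risk probability for one satellite tile (PIL image)."""
    batch = preprocess(tile_image).unsqueeze(0)
    with torch.no_grad():
        logits = model(batch)
    return torch.softmax(logits, dim=1)[0, 1].item()
```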
“Shrinkflation” isn’t the Long-Term Answer, Decision Intelligence is. Commentary from Peak’s Go-To-Market Director Ira Dubinsky
With decades-high inflation wreaking havoc on supply chains, brands are hopping on the “shrinkflation” bandwagon – reducing the weight, size, or quantity of their product to avoid raising prices. While this is not a new tactic, “shrinkflation” is a sure-fire way to steer customers toward the competition, with 49% of consumers opting for a different brand in response. It’s a short-term tactic for a long-term problem: “shrinkflation” doesn’t meaningfully change transportation routes, packaging, or other fixed overhead costs, just the quantities of raw materials per item. That means it doesn’t have the restorative impact on margins that companies are hoping for. Instead, brands should respond with operational changes. Typically, we see internal data silos dividing demand generation and demand fulfillment data, and the lack of visibility across the two is more detrimental to margins than rising production costs. Demand data can be leveraged to improve forecasting and increase the efficiency of fulfillment – a relatively easy short-term modification every brand should be implementing. However, Decision Intelligence (DI), the commercial application of AI to the decision-making process, is a long-term fix that takes this one step further. Brands already have immense amounts of rich data at their fingertips, and DI is capable of helping them uncover who their customers are, where they are, and what they’re buying. This insight can help CPGs in several ways, including reducing waste, eliminating unnecessary costs, and ensuring that the right products are in the right place at the right time. This ultimately allows brands to make incremental efficiencies across a constrained supply chain, keeping customers coming back and putting themselves in a position to combat inflation without shrinking their products.
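As a rough illustration of the demand-data point, the sketch below joins hypothetical demand-generation and fulfillment silos into one view and derives a naive forecast; the column names and figures are invented for the example.

```python
# Illustrative sketch: breaking the silo between demand-generation and
# fulfillment data, then producing a simple moving-average forecast.
import pandas as pd

demand_gen = pd.DataFrame({
    "week": [1, 2, 3, 4, 5, 6],
    "campaign_reach": [10_000, 12_000, 9_000, 15_000, 14_000, 16_000],
})
fulfillment = pd.DataFrame({
    "week": [1, 2, 3, 4, 5, 6],
    "units_shipped": [800, 950, 700, 1_200, 1_100, 1_300],
})

# One view across both silos.
combined = demand_gen.merge(fulfillment, on="week")

# Naive forecast: 3-week moving average of shipped units.
combined["forecast_next_week"] = combined["units_shipped"].rolling(window=3).mean()
print(combined.tail(1))
```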
Programmers Day. Commentary by Daniel Hutley, The Open Group® Architecture Forum Director
A significant number of businesses are adopting digital technology into their operations as it becomes increasingly ingrained in everyday life. This Programmers’ Day, it’s crucial for businesses to understand the importance of Agile programming teams and not underestimate the role these teams play. Agile teams are the key drivers of the enterprise’s digital transformation, inventing new business models, developing and maintaining software, and architecting highly automated operating systems. While Agile adoption has grown rapidly over the last two years and is being implemented as a strategic priority at pace, the communication and implementation of Agile methodologies will need to be rethought and reformulated for operations at scale. Implementing standards such as The Open Group Open Agile Architecture™ (O-AA) Standard is one way businesses can do this, since formal coordination plays an essential role in ensuring that positive, holistic outcomes derive from decision making. In this way, programming teams can bring Agile into their operations, increasing the ability to operate efficiently within individual teams and across the wider business. As we progress further into the digital age, and enterprises encourage the integration of Agile practices into business and programming operations, the focus for Agile moving forward will be on how enterprises make Agile part of their business’s DNA, not just something practiced by individual teams. Finally, and most importantly, a standardized, Agile, business-led approach means that digital enterprises will be better equipped to work together efficiently – from programming teams to wider operations – to embrace new technologies and advance operations, which is essential for the technology-enabled future.
Conversational AI is the Key to Improved Customer Experience. Commentary by Sanjay Macwan, Chief Information Officer and Chief Information Security Officer, Vonage
The world has undergone a tremendous digital transformation in the past few years, and this will continue to evolve and accelerate as businesses adopt technologies like artificial intelligence (AI) to enhance the customer journey. AI technologies – whether speech-to-text, text-to-speech, or deriving context via natural language processing/understanding (NLP/NLU) – can be key to driving improved customer communications and enhanced engagement. Often when people think of the term “conversational AI,” they imagine traditional chatbots. While some chatbots leverage AI, not all do. Conversational AI is a technology that powers communication solutions beyond just chatbots and turns those transactions and interactions into conversations. It works by combining technologies like NLP and automated speech recognition (ASR) to improve the nature of interactions with customers, to better understand their questions or needs, and to address them quickly – automated, contextual, and with a human-like touch. This allows customers to get quicker solutions to problems and frees human agents from the tedious task of continuously answering common questions. Traditionally leveraged in contact centers, conversational AI can also be used to improve company websites, retail and e-commerce platforms, and the ways companies collect and analyze data, and to enhance IT support – driving deeper engagement and further improving customer and user experience all around.
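The ASR-plus-NLU pipeline described above can be pictured in a few lines of Python. This is a conceptual sketch only: the stub functions and toy intent rules below stand in for the trained speech and language models a real system would call.

```python
# Conceptual sketch of a conversational AI pipeline: ASR turns audio into
# text, NLU extracts an intent, and a dialog step produces a response.
from dataclasses import dataclass

@dataclass
class NLUResult:
    intent: str
    confidence: float

def automatic_speech_recognition(audio_bytes: bytes) -> str:
    # Placeholder: a real system would invoke an ASR model or service here.
    return "what is my account balance"

def understand(text: str) -> NLUResult:
    # Placeholder NLU: a trained intent classifier would replace these rules.
    if "balance" in text:
        return NLUResult("check_balance", 0.93)
    return NLUResult("fallback", 0.20)

def respond(result: NLUResult) -> str:
    if result.intent == "check_balance" and result.confidence > 0.8:
        return "Your balance is available in the app. Want me to read it out?"
    return "Let me connect you with an agent who can help."

text = automatic_speech_recognition(b"...")   # speech-to-text
print(respond(understand(text)))              # automated, contextual reply
```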
In Sales, AI-Assisted Practice Makes Perfect. Commentary by Kevin Beales, VP & GM of Allego
Of the 75% of organizations that provided practice opportunities to employees during the pandemic, 90% found it effective. Artificial intelligence is providing new opportunities for sellers to practice. In sales, like in every industry, greatness isn’t an innate, unattainable skill for the select few. Great sellers are built through proper training—as long as this training is followed up with practice and coaching to make it stick. This is where the rubber meets the road for many sales teams. Managers have large, dispersed teams and lack time to provide the hands-on guidance their reps so desperately need. That’s where technology, such as artificial intelligence (AI), can help. Conversation intelligence (CI), powered by AI, records and analyzes reps’ practice sessions to help pinpoint where improvement is needed. AI tracks against topics and activities of top performers, providing timely feedback to sellers and identifying personalized coaching opportunities for managers. Instead of having to listen to hours of recorded sales conversations, sales managers can consult the data and quickly target moments where a call has gone well—or not. AI has significant potential to improve sales activities and outcomes as it becomes increasingly sophisticated and applicable to new use cases. For example, sales teams can now simulate conversations with AI-powered virtual actors to allow them to practice and receive feedback. AI can also automatically record meeting notes so sales reps don’t miss a single customer interaction. With AI, sales teams can more successfully navigate the modern sales landscape and improve sales readiness.
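A toy sketch of the topic-tracking idea follows; the topics and keywords are invented placeholders standing in for what a trained conversation intelligence model would actually learn from top performers’ calls.

```python
# Illustrative sketch: scoring a practice-call transcript for coverage of
# topics that top performers consistently hit, so a manager can target
# coaching without listening to hours of recordings.
TOP_PERFORMER_TOPICS = {
    "discovery": ["goals", "challenges", "current process"],
    "value": ["roi", "savings", "outcomes"],
    "next_steps": ["follow up", "demo", "timeline"],
}

def topic_coverage(transcript: str) -> dict:
    text = transcript.lower()
    return {
        topic: any(keyword in text for keyword in keywords)
        for topic, keywords in TOP_PERFORMER_TOPICS.items()
    }

practice_session = (
    "Thanks for the time. What challenges are you seeing with your "
    "current process? Based on that, here's the ROI we typically see. "
    "Can we set up a demo next week?"
)
print(topic_coverage(practice_session))  # flags hit or missed coaching topics
```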
World Engineer Day. Commentary by Peter Vetere, Senior Software Engineer at Singlestore
The computing field is vast and constantly changing, so it can seem quite intimidating if you’re just starting out. However, programming is all about breaking problems into smaller pieces. So, my advice is to start with a simple goal or problem in mind, even if you’ve never written a line of code in your life. I’ve found that it’s much easier to learn new concepts and technologies if there is a concrete use case to work from, however contrived it might be. Let’s say you want to try printing the numbers 1-10 on the screen. This seems simple, but there is a surprising amount of thought that goes into it. For example, the “screen” can mean different things, depending on the context. Do you want your output to go to a web browser, a command-line window or somewhere else? Do some research on popular programming languages and what they are used for. Python and JavaScript are common first choices. Once you’ve picked a goal and a language to use, find some beginner tutorials on programming in that language. Good tutorials will often teach you enough to do something useful in the first few pages. As you read through them, think about how you can use what you are learning to achieve the goal you set out for yourself. This kind of self-motivated, active curiosity is fundamental to being a successful engineer. Don’t be afraid to search for hints or ask questions in online forums, to engage with teachers or friends, or to join clubs. Given some time, patience and dedication, you’ll eventually accomplish your goal or solve your problem. It’s tremendously satisfying when you do. If you enjoyed the experience, think about some ways you can enhance your program. It will engage your creative mind and will naturally lead you down a path of more learning. After a while, you’ll start to get a feel for what you do and don’t like about programming. You don’t need to know everything – nobody does. Pursue your interests and find the fun in it.
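For example, here is what that first goal looks like in Python, one of the two beginner languages mentioned above, with output going to a command-line window:

```python
# A first program: print the numbers 1 through 10.
for number in range(1, 11):  # range(1, 11) yields 1, 2, ..., 10
    print(number)
```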
Borderless Data: What Makes for a Sensible Data Regulation Policy. Commentary by Chris McLellan, director of Operations, Data Collaboration Alliance
Think of it as another clash between an irresistible force and an immovable object: on one side, we have data that’s powering an ever-increasing number of personal and business applications while being copied at scale with little regard for consent forms, data governance policies, or jurisdictional borders; on the other, we have increasingly strict data protection regulations – some guarding personal information, others driven by economic and nationalistic interests. It’s an escalating tension that is putting a severe strain on innovation, compliance, and international commerce. In this maelstrom, neither side will give… but both sides should. Turning a blind eye to silos and copies – the root causes of our inability to actually control data and protect its value – is setting the bar very low. The bottom line is that if we want to square the circle of digital innovation and meaningful data protection for citizens and organizations, we need to accelerate technologies, protocols, and standards that prioritize the control of data by eliminating silos and copies. For example, the Zero-Copy Integration framework, which is set to become a national and international standard, provides technologists with a framework that is fundamentally rooted in control. There are also ‘zero-copy’ technologies like dataware that enable new digital solutions to be built quickly and without copy-based data integration. As we move forward, we may also need to consider enshrining the control and ownership of data as a human right. But whatever mix of technological, regulatory, and legal approaches we employ, the control of data through the elimination of silos and copies needs to be recognized by all stakeholders as the only logical starting point.
Satellite boom brings big data challenge. Commentary by Dr. Mike Flaxman (Ph.D., Harvard), Product Manager at HEAVY.AI
Commercial satellite deployments continue to boom. Last year saw a record number of commercial satellites launched into space – a 44% increase over 2020. These deployments are driving a new wave of data growth, but organizations now face an unprecedented challenge in managing and analyzing this unique geospatial data at such incredible scale. Satellites collect massive, complex, heavy datasets. While organizations now have the infrastructure to store and transport this heavy data, they lack a way to reliably analyze it at scale. To do that effectively, they’ll have to harness GPUs and CPUs in parallel to deliver the speed and visualization capabilities needed to map and learn from this data. Satellite geospatial data will support critical use cases – predicting wildfires, measuring climate change, improving 5G services – but organizations will have to find new technology to properly wield it.
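As a simple illustration of parallel array analysis on satellite rasters, the sketch below computes NDVI, a standard vegetation index used in wildfire and climate work. It uses NumPy on synthetic bands; the CuPy swap noted in the comment is a general GPU option assumed for illustration, not a HEAVY.AI product detail.

```python
# Illustrative sketch: computing NDVI over satellite raster bands.
# NumPy runs this element-wise math in parallel on CPU; CuPy mirrors the
# NumPy API, so the same code can run on a GPU.
import numpy as np
# import cupy as np  # hypothetical GPU drop-in

# Synthetic 4,096 x 4,096-pixel red and near-infrared bands.
rng = np.random.default_rng(0)
red = rng.random((4_096, 4_096), dtype=np.float32)
nir = rng.random((4_096, 4_096), dtype=np.float32)

# NDVI = (NIR - Red) / (NIR + Red), computed across the whole raster.
ndvi = (nir - red) / (nir + red + 1e-9)
print(float(ndvi.mean()))
```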
IT Professionals Day. Commentary by Carl D’Halluin, chief technology officer at Datadobi
IT Professionals Day is a reminder of the value IT pros provide to virtually every enterprise organization. They do so much to maintain organizations’ infrastructure, and without them, companies would struggle to operate day-to-day. In addition to recognizing their accomplishments on IT Professionals Day, we should also make sure we’re providing them with the tools they need to make their work easier every day of the week. This includes tools they can use to automate routine tasks and tackle some of the biggest issues facing enterprises today, including reducing cost, carbon footprint, and business risk. By giving IT professionals purpose-built solutions they can use to get the most out of their unstructured data and manage its exponential growth, a huge burden is lifted off their shoulders so they can focus on revenue-generating tasks.
How modern innovations like AI are necessary for the future of NAC. Commentary by Jeff Aaron, VP Enterprise Marketing, Juniper Networks
NAC (Network Access Control), still often a go-to solution to protect enterprise networks, was created at a very different time – before the widespread use of laptops, BYOD policies, and IoT devices. Since then, the number of devices accessing a network has grown exponentially, as has the complexity of deployments. Given their complex nature, cost, and lack of operational flexibility, traditional NAC solutions have struggled to keep up with the demands of the modern digital era. As a result, the NAC industry is turning to artificial intelligence (AI). AI, when used in concert with a modern cloud architecture, is a natural solution for improving the deployment and operation of NAC by simplifying and improving processes and unlocking new use cases. With AI-enabled NAC, for example, a network can automatically identify and categorize users based on what it knows about them from past connections and proactively allocate the appropriate resources and policies to optimize user experiences on an ongoing basis. AI can also improve the efficacy of NAC by rapidly analyzing everything from the initial login to a user’s behavior across the network to flag any suspicious activity. Furthermore, if a device has been compromised, it can be identified immediately and proper actions taken proactively, such as quarantining the device at the edge of the network. This substantially minimizes the impact of exposure. Lastly, AI-enabled NAC can flag devices that are having trouble connecting to the network and take corrective actions before users even know a problem exists. This type of self-healing network operation maximizes user connectivity while minimizing troubleshooting and help desk costs. Current trends have created new demands across networks, particularly with respect to network access control. By integrating the automation and insights that come from AI with the simplicity and scale of the cloud, next-generation NAC solutions promise to deliver far more functionality than traditional solutions at lower cost. The need for NAC is as strong as ever, as is the need to transition to new AI-based NAC solutions designed for the modern digital era.
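A minimal sketch of the behavior-flagging idea follows, using scikit-learn’s IsolationForest on synthetic session features. It illustrates the general anomaly-detection technique, not Juniper’s actual implementation.

```python
# Illustrative sketch: flagging anomalous network sessions, the kind of
# signal an AI-enabled NAC could act on by quarantining a device.
import numpy as np
from sklearn.ensemble import IsolationForest

# Each row: [bytes_sent_mb, distinct_ports, failed_logins] per session.
normal_sessions = np.array([
    [1.2, 3, 0], [0.8, 2, 0], [1.5, 4, 1], [0.9, 3, 0],
    [1.1, 2, 0], [1.3, 3, 0], [0.7, 2, 1], [1.0, 3, 0],
])
detector = IsolationForest(contamination=0.1, random_state=0)
detector.fit(normal_sessions)

# A compromised device probing many ports with repeated failed logins.
suspicious = np.array([[45.0, 60, 12]])
if detector.predict(suspicious)[0] == -1:   # -1 means anomaly
    print("Anomaly detected: quarantine device at the network edge")
```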
AI and ML Shortcomings in Cybersecurity. Commentary by InsightCyber CEO and founder, Francis Cianfrocca
Some well-known cybersecurity providers have attempted to offer ML and AI solutions, but the ML has been focused on limited vectors. This is a problem – as any security operations center (SOC) analyst will tell you – and this type of AI and ML offering is of limited use. Security professionals’ hands are already tied as they deal with the challenge of false-positive alerts yielded by AI. When the AI’s scope is restricted, it poses even more of a problem and will not be used effectively. Along with false positives, AI also generates false negatives, which are the goal of advanced persistent threats (APTs) and what nation-state attackers rely on. So the human analyst is busy determining what is typical for an environment and what is abnormal in order to weed out the false positives and negatives and customize alerts to identify real threats early on. Lack of detection is still a significant problem in the world of cyber-attacks – for the most sophisticated types of breaches as well as the more traditional methods of attack (i.e., phishing or misrepresentation). That is why, now more than ever, AI/ML applications need to be fine-tuned and enhanced by human experts, resulting in cybersecurity that genuinely works.
How Automated Machine Learning Helps Marketers Market Smarter. Commentary by Steve Zisk, Senior Product Marketing Manager, Redpoint Global
Automated machine learning (AML) has many business use cases, including the ability to educate marketers on how to better understand a customer journey. If you have large amounts of data that cannot simply be calculated, aggregated or averaged, you likely have a very good business use case for AML, especially if the data needs to be scaled. As with other data-intensive activities, ML results will depend on the quality and breadth of the data you feed it, so missing or low-quality input about a customer will result in building poor models. There are effectively two classes of models that marketers should have a basic understanding of in order to be smarter with AML – unsupervised and supervised. For unsupervised models, a user provides data and asks the model to find patterns within that data, making it useful for audience selection or segmentation. Rather than an artificial, manual segmentation based on intuition or arbitrary cut-offs (e.g., age or income), an unsupervised clustering model segments based on what the data dictates is important, interesting or unique about a particular audience. Supervised, on the other hand, uses historical data to build a model that will sort through the variables to find a good predictor for those results – whether it’s customer lifetime value, churn or another measurable behavior or attitude. These models can be built, tested, and optimized by familiar marketing techniques like A/B testing and control audiences. AML’s ability to give marketers a way to ask better questions and make better decisions will ultimately help them understand their customers more. The more they know about a customer, the more they can design relevant messages and offers to improve the customer experience.
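A compact sketch of both model classes on toy customer data follows: KMeans clustering as the unsupervised segmentation example and logistic regression as a simple supervised churn predictor. These are illustrative scikit-learn choices, not the internals of any particular AML product, and all values are synthetic.

```python
# Illustrative sketch: unsupervised segmentation vs. supervised churn
# prediction on toy customer data.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LogisticRegression

# Each row: [annual_spend, orders_per_year]
customers = np.array([
    [120, 2], [150, 3], [900, 12], [950, 14], [80, 1], [1000, 15],
])

# Unsupervised: the data itself defines the segments (no labels supplied).
segments = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(customers)

# Supervised: historical churn labels (1 = churned) train a predictor.
churned = np.array([1, 1, 0, 0, 1, 0])
churn_model = LogisticRegression().fit(customers, churned)

new_customer = np.array([[130, 2]])
print("segments:", segments)
print("churn probability:", churn_model.predict_proba(new_customer)[0, 1])
```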
Why subscription models still have hope. Commentary by Dr Vinod Vasudevan, CEO of Flytxt
Inflation fears present major challenges for businesses, especially subscription-based models, and maintaining a solid customer base is imperative not only for growth but for survival. More emphasis should be placed on customer lifetime value, and increasing the value of existing customers is a great way to drive growth. For businesses to maintain this as an important metric, investing in AI and data-driven analytics will ultimately enable them to better measure progress and focus on outcomes.
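To make the metric concrete, here is a minimal sketch using one common closed-form CLV approximation; the formula choice and the subscriber figures are illustrative assumptions, not Flytxt’s method.

```python
# Illustrative sketch: a simple customer lifetime value (CLV) estimate.
# With periodic margin m, retention rate r, and discount rate d, a common
# closed-form approximation is CLV = m * r / (1 + d - r).
def customer_lifetime_value(margin: float,
                            retention_rate: float,
                            discount_rate: float) -> float:
    return margin * retention_rate / (1 + discount_rate - retention_rate)

# Hypothetical subscriber: $30/month margin, 92% retention, 1% discount rate.
print(round(customer_lifetime_value(30.0, 0.92, 0.01), 2))
```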
New School Year – New AI & Machine Learning Tools for Students. Commentary by Brainly’s CTO, Bill Salak
If you work in AI and ML, it’s no surprise when I say there’s considerable lead time needed to build new applications. With that in mind, and recognizing that the school experience for the 2022–2023 school year will look very different from 2021–2022, I predict we’ll see a frenzy of applications built for last year’s problems being repurposed and reimagined for this year’s problems. What are those problems? Two of the biggest in the news now are teacher shortages and students entering a new school year unprepared to move on to the new curriculum. In terms of specific applications of AI/ML, I predict we’ll see more AI/ML teacher-assistance capabilities plugged into Learning Management Systems and Learning Experience Platforms, smarter virtual classroom and tutoring experiences, and lots of imaginative uses of GPT-3 and, more generally, NLP applications pushing boundaries. As we get further along in the school year, expect to see more new products released to help classrooms do more with fewer teachers and to help teachers and students close the pandemic-induced setbacks in academic achievement and the spectrum of learning gaps across the student population.
Why Asset Intelligence Is Vital to Improve Data-driven Process Automation. Commentary by Arthur Lozinski, CEO and Co-Founder at Oomnitza
Data is often thought of as a commodity, but the value lies in how data is used. Take business process automation for IT. In order to streamline processes to realize operational, security, and financial efficiency, organizations require accurate data correlation across endpoints, applications, network infrastructure, and cloud. With increasing amounts of fragmented data collected every day from tens of thousands of technologies, it’s no wonder enterprises are experiencing technology management blind spots. Amid the growth of hybrid workplaces, multi-cloud, and digital business, a recent survey found that 76% of enterprises are using multiple systems to obtain inventory data and 45% have wasted expenditures on software licenses and cloud services. This is where multi-source asset data correlation comes into play as an essential component of task automation. How can organizations develop effective workflows if those workflows are triggered by siloed IT management tool data that lacks both accuracy and broader operational context? Organizations need to consolidate, normalize, and then process data from their siloed tools such as device management, SaaS, SSO, cloud, healthcare management systems, and purchasing. With a unified system of record for enterprise technology management, organizations will then be able to create and mature workflows with predictable outcomes and greater efficiencies. Applying this to a complex business process such as automated secure offboarding, organizations will be able to streamline tasks based on accurate employee, manager, department, location, resource access, and device data from Separation to Recovery – ensuring complete deprovisioning, workspace transfer, license repurposing, archiving, and asset reclamation. When it comes to key business process automation for IT, asset intelligence is foundational.
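A minimal sketch of the consolidate-and-normalize step follows, using pandas; the source systems, field names, and records are hypothetical stand-ins for real device management and SSO tools.

```python
# Illustrative sketch: consolidating and normalizing asset records from two
# siloed tools into one unified view, the precursor to automated workflows
# such as secure offboarding.
import pandas as pd

device_mgmt = pd.DataFrame({
    "serial": ["C02XK1", "C02XK2"],
    "assigned_to": ["ANA.SILVA", "ben.wong"],
    "os": ["macOS 14", "Windows 11"],
})
sso = pd.DataFrame({
    "user": ["ana.silva", "ben.wong"],
    "last_login": ["2024-05-01", "2024-05-03"],
    "apps": [["CRM", "Email"], ["Email"]],
})

# Normalize identifiers, then correlate records across the silos.
device_mgmt["user"] = device_mgmt["assigned_to"].str.lower()
unified = device_mgmt.merge(sso, on="user")

# An offboarding workflow can now see devices AND app access per employee.
print(unified[["user", "serial", "apps", "last_login"]])
```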
Calculating Value of Product Costs. Commentary by Asim Razzaq, co-founder and CEO of Yotascale
As a former engineer, I’ve seen firsthand how difficult it can be for product teams to determine which products are performing – and which need help. For those who may be struggling, here are a few tips: (i) to calculate accurate profit margins or revenue figures for the products or services being built for end users, cost information should be broken down by product or engineering; (ii) engineering needs to see cost at the most granular level of operations, where apps are being built and run; (iii) granular visibility can enable quick identification of cost-inflating anomalies at the level where they are most likely to occur – and can help engineering teams determine which products have the highest value and should receive the most time, money, and resources; (iv) C-suite and engineering teams need to work together to identify the accurate value of product costs and avoid the simplistic thinking that can blur the truth about a product’s true value.
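As a toy illustration of tip (i), the sketch below breaks cost data down by product and computes per-product margins; the products and figures are entirely hypothetical.

```python
# Illustrative sketch: per-product cost breakdown and margin calculation.
products = {
    "product_a": {"revenue": 500_000, "cloud_cost": 120_000, "eng_cost": 250_000},
    "product_b": {"revenue": 300_000, "cloud_cost": 40_000, "eng_cost": 90_000},
}

for name, p in products.items():
    margin = p["revenue"] - p["cloud_cost"] - p["eng_cost"]
    margin_pct = 100 * margin / p["revenue"]
    print(f"{name}: margin ${margin:,} ({margin_pct:.0f}%)")
# A higher margin rate may justify more time, money, and resources.
```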