Oracle expands its multicloud partnership with Microsoft and goes live with its limited preview of Oracle Database@AWS https://robustcloud.com/oracle-multicloud-partnership/ Fri, 06 Dec 2024 15:52:17 +0000

The post Oracle expands its multicloud partnership with Microsoft and goes live with its limited preview of Oracle Database@AWS appeared first on Robustcloud.


Oracle positions itself as a vendor with unique multicloud capabilities, giving customers direct access to Oracle Database services running on Oracle Cloud Infrastructure (OCI) and deployed in Oracle, Amazon Web Services (AWS), Google Cloud, and Microsoft Azure data centers. At Oracle CloudWorld 2024, Oracle announced its partnership with AWS, adding a third infrastructure partner after Google Cloud and Microsoft Azure. At Microsoft Ignite in November 2024, Oracle deepened its relationship with Microsoft, expanding the availability of Oracle Database@Azure to new regions while adding new features and capabilities (see Figure 1 below). This blog post covers Oracle’s multicloud strategy and explores the expanded partnership with Microsoft Azure.

Figure 1: A Single Stack Use Case with Microsoft Azure. Source: Oracle

Oracle’s Multicloud Strategy:

1. AWS:

·       Oracle Database@AWS will allow organizations to access Oracle Database services directly within AWS data centers, including Oracle Autonomous Database and Oracle Exadata Database Service. This offering provides a low latency network connection between Oracle databases and applications on AWS, making it easy for customers to connect enterprise data in their Oracle databases to applications running on Amazon EC2, Amazon Analytics services, or AWS’ AI and machine learning services, including Amazon Bedrock.

·       Other benefits include native console integration, zero-ETL integration for analytics, and unified customer support, which empowers businesses to innovate while maintaining operational agility.

·       At AWS re:Invent, Oracle announced that Oracle Database@AWS is available in limited preview, allowing select customers to use Oracle Exadata Database Service in the AWS US East (N. Virginia) Region to begin migrating and deploying their enterprise workloads to the cloud. Oracle will operate and manage the Oracle Exadata Database Service, and Oracle Database@AWS is slated for broader availability in the coming months.
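To make the connectivity above concrete, the sketch below builds an Oracle EZConnect descriptor that an application on Amazon EC2 could use to reach a database endpoint. The hostname, service name, and credentials are placeholders invented for illustration, not real Oracle Database@AWS values, and the actual connection step is left as a comment because it needs the python-oracledb driver and a reachable database.

```python
def ezconnect_dsn(host: str, port: int, service_name: str) -> str:
    """Build an Oracle EZConnect descriptor (host:port/service_name),
    the short-form connect string accepted by Oracle drivers."""
    return f"{host}:{port}/{service_name}"

# Hypothetical endpoint; the real hostname would come from the
# Oracle Database@AWS console after provisioning.
dsn = ezconnect_dsn("exadata.internal.example.com", 1521, "salespdb")

# An EC2-hosted app would then connect with the python-oracledb
# driver (sketch only, not executed here):
#   import oracledb
#   conn = oracledb.connect(user="app_user", password="...", dsn=dsn)
```

Because the database runs inside the same AWS data center, that connect string resolves over a low-latency local network rather than a cross-cloud link.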

2. Google Cloud:

·       Oracle Database@Google Cloud provides native access to Oracle Database services in Google Cloud data centers, combining Oracle’s database expertise with Google Cloud’s AI and analytics capabilities.

·       Customers can leverage Oracle Autonomous Database, Oracle Exadata Database Service, and Oracle Database Autonomous Recovery Service alongside tools like Google Vertex AI to build AI-driven applications and generate deeper analytical insights within a unified operating environment.

·       Oracle Database@Google Cloud is available in four Google Cloud regions: N. Virginia, Salt Lake City, Frankfurt, and London, and will expand further in the coming months across North America, Europe, the Middle East, Africa, Asia Pacific, and Latin America.

3. Microsoft Azure:

·       Oracle Database@Azure gives organizations the power and versatility to run Oracle Exadata Database Service and Oracle Autonomous Database natively within Azure’s data centers. This enables access to Azure’s global network, enterprise-grade AI, and analytics tools. The offering runs on OCI within Azure data centers and offers low-latency connectivity and a seamless management experience.

·       Oracle Database@Azure is now generally available in ten regions worldwide, with an additional 23 regions planned by the end of 2025. Current regional availability includes Australia East, Brazil South, Canada Central, East US, France Central, Germany West Central, Italy North, US West (DR), UK South, and UK West (DR).

·       Deeper network integration now supports customer-managed keys in Azure Key Vault over Private Link, and expanded Terraform support allows customers to choose the Terraform style that best aligns with their infrastructure-as-code (IaC) approach.

Key Advantages of Oracle’s Multicloud Strategy

Below are a few highlights of Oracle’s multicloud advantages.

·       Simplified Cloud Migration: Oracle’s offerings are designed to reduce complexity, with tools like Oracle Zero Downtime Migration and support for Bring Your Own License (BYOL).

·       Global Availability: Oracle has expanded its services across multiple regions, ensuring businesses worldwide can leverage its technologies for consistent, reliable performance. This is especially important for customers with a global presence to meet regulatory requirements unique to each geography.

·       Unified Management and Support: Partnerships prioritize a seamless experience with integrated dashboards, unified billing, and comprehensive support from Oracle and its partners. This gives customers a better and simpler cloud experience, freeing resources for other initiatives.

·       Improved Innovation: By combining Oracle’s advanced database solutions with the best-suited tools from other cloud providers, Oracle facilitates faster insights and application modernization, enabling enterprises to remain competitive. With this capability, customers can keep existing investments while retaining the ability to leverage the most appropriate capabilities from each cloud service provider.

Oracle and Microsoft Partnership Specifics

The collaboration between Microsoft and Oracle, initially defined by Oracle’s multicloud ambition, has significantly evolved in 2024. With both tech giants continuously expanding Oracle Database@Azure, the initiative provides joint customers with new ways to manage and deploy critical workloads across multicloud environments.

Microsoft and Oracle are leaders in cloud innovation, each excelling in distinct domains. Microsoft Azure, renowned for its extensive global network and robust enterprise integrations, brings a vast portfolio of AI, analytics, and developer tools. On the other hand, Oracle’s expertise lies in its flagship database solutions, mainly Oracle Database and Exadata infrastructure, which are well known for their scalability, reliability, performance, and security.

Oracle Database@Azure is a solution that combines the best of both worlds, bringing together the synergy of both organizations. The service runs on OCI within Azure data centers and offers low latency connectivity, security, and unified management. Its compatibility with Azure’s development and AI tools provides an easy-to-use environment for enterprise-grade applications.

The Value of the Partnership

Oracle and Microsoft announced momentum updates at Oracle CloudWorld and Microsoft Ignite, underscoring the growing customer demand for Oracle Database@Azure. Key developments included:

1. Expansion of Availability: Oracle Database@Azure is now live in ten regions, including new expansions to Brazil South and Italy North. This aligns with the planned footprint of 23 additional regions by the end of 2025, expanding Oracle’s geographical footprint and offering Oracle Database@Azure to more global customers.

2. Enhanced Integrations: The Oracle GoldenGate integration with Microsoft Fabric now supports the public preview of Open Mirroring, helping ensure data is always analytics ready.

3. Multicloud Accessibility: Flexible pricing models and support for “Bring Your Own License” (BYOL) programs reduce barriers for organizations transitioning to multicloud, further cementing Oracle and Microsoft’s partnership in addressing enterprise cloud migration challenges.

Customer Success Stories

The true testament to the partnership’s success lies in its customer impact. The following organizations across various sectors have embraced Oracle Database@Azure for transformative outcomes:

1. The Craneware Group: This healthcare solutions provider chose Oracle Database@Azure to power its pharmacy SaaS applications, which serve over 12,000 healthcare institutions. The service’s reliability and security played a vital role in meeting the critical demands of healthcare compliance and patient services.

2. Vodafone: Oracle Database@Azure facilitated a seamless multicloud strategy to unify European database operations. This enabled cost-effective, high-performance service delivery while enhancing operational flexibility.

3. MSCI: Leveraged Oracle Exadata Database Service to bolster business continuity and resilience in financial analytics. The company reported improved safeguards for its mission-critical operations, ensuring reliability amid growing global demands.

4. Fonterra: New Zealand’s largest company turned to Oracle Database@Azure to integrate its existing Azure and OCI environments. This enhanced the efficiency of its global dairy export operations and provided a foundation for future growth.

Conclusion

Oracle’s customers span various industries, including leading banks, healthcare organizations, independent software vendors, and telecommunications companies. Oracle’s approach redefines how businesses interact with the cloud, delivering a robust multicloud ecosystem that empowers customers with flexibility and choice for their most critical workloads. Oracle allows customers to drive continuous innovation by seamlessly integrating its products with AWS, Google Cloud, and Microsoft Azure services.

The partnership between Microsoft and Oracle characterizes how collaboration can drive innovation in the cloud industry. By bridging their strengths, the two companies have created a multicloud ecosystem that prioritizes customer success, operational efficiency, and technological innovation. The references above illustrate that customers strongly believe in the value provided by the joint offering. This partnership sets a new standard for how cloud providers can collaborate to address the diverse needs of modern businesses.

WebAssembly Unleashed: Revolutionizing Safety-Critical Embedded Systems https://robustcloud.com/webassembly-unleashed/ Thu, 21 Nov 2024 20:27:55 +0000

The post WebAssembly Unleashed: Revolutionizing Safety-Critical Embedded Systems appeared first on Robustcloud.


Introduction

The rapid advancement of embedded systems, especially those that are safety-critical and integral to cyber-physical systems, presents challenges and opportunities for the WebAssembly (Wasm) community. In a recent panel discussion at the Linux Foundation’s WasmCon, held in Salt Lake City on November 11 and 12, 2024, titled “Safety-Critical Meets Web-Native: WebAssembly (Wasm) Revolutionizes Embedded Systems,” subject matter experts explored how Wasm is shaping the future of embedded systems across various industries, including automotive, industrial automation, and consumer electronics. The session delved into critical areas such as resource constraints, real-time performance, safety and reliability, security, and standards-based interoperability, offering valuable insights into how Wasm is revolutionizing the embedded systems landscape. This write-up categorizes the essential subjects discussed at the session, with observations from the following panelists and moderator (image below):

Moderator:

Divya Mohan, Open-Source Leader (Community), SUSE

Panelists:

Stephen Berard, Chief Technology Officer, Atym

Larry Carvalho, Principal Consultant, RobustCloud

Dan Mihai Dumitriu, Director and CTO, midokura, a Sony Group company

Chris Woods, Senior Key Expert (SME), Siemens

Figure 1: Image of the screen during the session

Benefits of Wasm in Resource-Constrained Environments

Embedded systems often operate under strict resource limitations, making efficiency a critical concern. Wasm offers significant benefits in such environments:

  • Language Flexibility: Traditionally, embedded software development has been confined to low-level languages like C and C++, posing a barrier to developers proficient in other languages. Chris from Siemens highlighted that Wasm allows developers to use a variety of languages, bringing new skills and problem-solving approaches to embedded devices.
  • Small Runtime Footprint: Stephen from Atym pointed out that Wasm runtimes like the Wasm Micro Runtime (WAMR) are incredibly lightweight, with footprints as small as 50KB. This minimal overhead is crucial for devices that cannot support full operating systems, virtual machines, or containers. Projects such as wasm2c can reduce this footprint even further.
  • Code Portability and Modularity: Wasm enables the creation of portable and modular code, reducing development friction and facilitating more accessible updates and deployments in embedded systems.

Real-Time Performance in Cyber-Physical Systems

Real-time performance is essential for the functionality of cyber-physical systems. The panelists discussed how Wasm addresses real-time requirements:

  • Predictable Performance: Chris noted that WAMR meets many real-time use cases due to its predictable and efficient performance on hardware, with minimal jitter.
  • Ahead-of-Time Compilation: Dan from Sony emphasized that ahead-of-time (AOT) compilation can enhance performance, sometimes even outperforming native GCC-compiled code.
  • Avoidance of Garbage Collection (GC): Wasm’s ability to ensure memory safety without relying on garbage collection is crucial for real-time systems, as GC can introduce latency and unpredictability.
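The last point, memory safety without garbage collection, can be illustrated with a toy Python model of Wasm’s linear memory: a flat, pre-allocated byte array whose accesses are bounds-checked and trap deterministically, with no collector pauses involved. This is a sketch of the idea only, not how any particular runtime is implemented.

```python
WASM_PAGE_SIZE = 65536  # Wasm linear memory grows in 64 KiB pages

class LinearMemory:
    """Toy model of Wasm linear memory: bounds-checked byte access
    over a fixed buffer, so memory safety needs no garbage collector."""

    def __init__(self, pages: int = 1):
        self._data = bytearray(pages * WASM_PAGE_SIZE)

    def store_u8(self, addr: int, value: int) -> None:
        if not 0 <= addr < len(self._data):
            raise IndexError("trap: out-of-bounds store")
        self._data[addr] = value & 0xFF  # values wrap to a byte

    def load_u8(self, addr: int) -> int:
        if not 0 <= addr < len(self._data):
            raise IndexError("trap: out-of-bounds load")
        return self._data[addr]

mem = LinearMemory(pages=1)
mem.store_u8(100, 42)
```

Every access costs the same deterministic check, which is why this style of safety suits real-time systems better than collector-driven pauses.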

Enhancing Safety and Reliability in Embedded Systems

Safety and reliability are paramount in industries like automotive and industrial automation:

  • Safety-Critical Opportunities: Chris highlighted the potential for a “safety-certified Wasm runtime,” which could decouple software from hardware certifications, lowering development costs and enabling mixed-criticality solutions where non-safe software runs safely in a sandboxed environment.
  • Standardization and Certification: Formal specifications like the Spec-Tec DSL for Wasm can aid in automated proofs and validations, streamlining the certification process for safety-critical applications.

Improving Security Posture with Wasm

Security is a significant concern, especially with the increasing connectivity of devices:

  • Sandboxing Capabilities: Wasm’s sandboxed execution model restricts code interaction to explicitly allowed operations, enhancing security by preventing unauthorized access to system resources.
  • Fine-Grained Control: Stephen mentioned that developers can limit runtime environment capabilities based on application needs, such as restricting network access if not required.
  • Response to Emerging Threats: Larry emphasized the importance of security in the context of increasing cyber-attacks, particularly for electrical grids and industrial applications.
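The fine-grained control described above can be sketched in a few lines of Python: guest code reaches the outside world only through host functions the embedder explicitly grants, so an application that was never granted a network capability simply cannot open a socket. This toy example mirrors the spirit of Wasm imports, not any specific runtime’s API.

```python
class HostEnv:
    """Toy capability table in the spirit of Wasm imports: guest code
    can invoke only the host functions it was explicitly granted."""

    def __init__(self, granted):
        self._granted = dict(granted)

    def call(self, name, *args):
        if name not in self._granted:
            raise PermissionError(f"capability not granted: {name}")
        return self._granted[name](*args)

# Grant logging, but deliberately withhold any networking capability.
env = HostEnv({"log": lambda msg: f"[guest] {msg}"})
```

Denial is the default: anything not listed in the table fails, which is the inverse of a traditional process that can call any system API unless blocked.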

Aligning Wasm Roadmap with Embedded Systems Needs

The panelists offered feedback to the Wasm community to better align its development with industry needs:

  • Support for Legacy Software: Chris stressed the importance of supporting legacy POSIX-compliant software to ease adoption in the embedded space.
  • Lightweight Runtimes: Keeping the runtime lightweight and simplifying function calls are crucial for resource-constrained devices.
  • Continued Engagement: Ongoing collaboration through groups like the Embedded Special Interest Group (SIG) can help address technical challenges and align development efforts.

Importance of Standards-Based Interoperability

Standardization is vital for widespread adoption:

  • Creating a Thriving Ecosystem: Chris noted that standards lay the groundwork for ecosystems where partners and competitors can invest and evolve technology over decades.
  • Industry Collaboration: Companies like Siemens, Atym, and Sony contribute to standardization efforts through participation in the W3C ecosystem, the Embedded SIG, and research collaborations.
  • Balancing Innovation and Standards: Dan highlighted the need to balance rapid deployment with adherence to standards to ensure progress and interoperability.

Innovative Use Cases of Wasm

The panelists shared how their organizations leverage Wasm:

  • Containerization for Embedded Devices: Stephen discussed Atym’s development of a containerization solution using Wasm to enable OCI-like application containers on embedded devices with minimal footprints.
  • Signal Processing in Cyber-Physical Systems: Dan mentioned Sony’s use of Wasm for sandboxed signal processing, particularly outputs from deep neural networks.
  • Addressing Technical Challenges: These implementations face challenges like integrating runtimes with host systems, creating secure APIs, and ensuring safety, which the organizations are collaboratively addressing.

Collaboration Among Industry Leaders

Collaboration accelerates the advancement of Wasm in embedded systems:

  • Embedded SIG Participation: Companies are working together in the Embedded SIG to tackle technical problems, share efforts, and communicate effectively with the broader Wasm ecosystem.
  • Open-Source Contributions: Projects like Atym’s Ocre runtime are released as open-source under the LF Edge project, inviting community contributions and fostering innovation. Project link: https://lfedge.org/projects/ocre/

Future of Wasm in Embedded Systems

Looking ahead, the panelists envision:

  • Safety-Critical Certifications: Achieving a safety-certified Wasm runtime would be a significant milestone, enabling broader adoption in safety-critical applications.
  • Attracting New Talent: Wasm’s language flexibility may attract developers who are not traditional embedded systems programmers, enriching the talent pool.
  • Hardware Competition: Hardware platforms may compete aggressively to support Wasm, potentially leading to domain-specific accelerations.

Advice for Developers and Stakeholders

For those considering adopting Wasm in embedded projects:

  • Viability and Adoption: Chris assured the audience that Wasm is a viable platform, already successfully deployed on millions of devices by major companies.
  • Getting Started: Developers are encouraged to download a runtime like WAMR, begin their projects, and engage with the community through forums and SIGs.
  • Understanding the Ecosystem: While discussions around new features may seem complex, they are part of improving an already stable and widely adopted platform.

Conclusion

The panel discussion underscored Wasm’s transformative potential in the embedded systems domain. By addressing resource constraints, enhancing real-time performance, improving safety and security, and fostering standardization, Wasm is poised to revolutionize the industrial sector with automation. Collaborative efforts among industry leaders and the Wasm community are crucial in overcoming challenges and shaping the future roadmap. Developers and stakeholders are encouraged to embrace Wasm, contribute to its evolution, and leverage its capabilities to build the next generation of embedded and cyber-physical systems. The future of embedded systems is not just about hardware innovation but also about adopting flexible, secure, and efficient software platforms like Wasm.

Streamlining Operations with Workday Extend: A Customer’s Story https://robustcloud.com/streamlining-operations-with-workday-extend/ Thu, 03 Oct 2024 12:29:09 +0000

The post Streamlining Operations with Workday Extend: A Customer’s Story appeared first on Robustcloud.


Workday Rising is Workday’s annual conference, where customers, partners, and industry leaders gather to learn about the latest innovations, network with peers, and explore how Workday’s solutions can drive their organizations forward. Workday Extend is a platform (see Figure 1 below) that enables organizations to develop, deploy, and manage custom applications seamlessly integrated with Workday’s core human capital management and financial systems. RobustCloud met with Ed Duda of Cushman & Wakefield at Workday Rising in Las Vegas to discuss the organization’s experience using Workday Extend.

Large organizations face operational hurdles in today’s dynamic business landscape. Cushman & Wakefield, a leader in commercial real estate services, is an example. With over 52,000 employees across 400 offices, Cushman & Wakefield faced the challenge of streamlining processes and improving accountability. This blog post describes how Cushman & Wakefield addressed these challenges by leveraging Workday Extend, followed by why they chose the product and advice for other customers.

Figure 1: Workday Extend Architecture (Source: Workday)

Challenges Faced

Improving Ease of Doing Business

Recognizing the need for a streamlined approach, Cushman & Wakefield embarked on a journey to simplify its business processes. This was not just a matter of choice but a necessity to meet the stringent regulatory requirements of a publicly traded company. With a sprawling workforce operating from numerous locations, standardizing operations became crucial to enhancing efficiency and accountability.

Navigating Complex Systems

Employees faced significant challenges when finding and navigating the systems intended to align the global organization with Workday. The absence of a unified system created obstacles for teams, making it hard for them to collaborate effectively and access the necessary information promptly.

Aligning Diverse Service Lines

Cushman & Wakefield operates multiple service lines, including property management and facility services. The organization faced challenges due to non-linear and siloed operations, with each division functioning independently. This decentralized structure made aligning the entire organization in a unified direction difficult.

Leveraging Workday Extend

Connecting Siloed Businesses

Cushman & Wakefield faced a significant challenge integrating their separate business units and tailoring Workday to fit their specific requirements. By utilizing Workday Extend, they created personalized solutions, allowing Cushman & Wakefield to seamlessly link different systems and processes.

Enhancing Labor Billing Accuracy

One critical use case was the inaccuracy in recording labor billing and assigning proper project codes. With millions of transactions annually, errors led to incorrect invoices and significant time spent on corrections. By developing a Workday Extend application, Cushman & Wakefield targeted reducing the time and effort required to rectify these errors, ensuring accurate financial statements and compliance with regulatory reporting requirements.

Choosing Workday Extend

Addressing Operational Inefficiencies

Cushman & Wakefield’s day-to-day activities depended on allocating workers to different projects, each assigned a code according to the level of demand. The frequent shuffling of labor led to coding errors, which in turn required tickets to rectify the invoicing mistakes. Before Workday Extend, a specialized team of twelve contractors devoted 45 minutes per ticket to resolving these errors. C&W implemented Workday Extend to speed up this process.

Partnering for Success

To navigate the nuances of Workday Extend during the design, build, and deployment phases, Cushman & Wakefield partnered with Intercrowd, a systems integration expert. Despite integration challenges, this collaboration accelerated the deployment process, implementing the labor billing correction solution within six months.

Achieving Results

Post-deployment, Cushman & Wakefield witnessed remarkable improvements within 45 days. The contractor headcount was reduced from twelve to five, saving the company a million dollars annually. The average time to handle a ticket decreased from 45 minutes to four minutes and eventually down to two minutes, showcasing the efficiency of the Workday Extend solution.
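A back-of-envelope calculation using only the figures quoted above shows the scale of the improvement; the post does not state the annual ticket volume, so total hours saved are left out.

```python
# Figures quoted in the post.
BEFORE_MIN_PER_TICKET = 45   # minutes per ticket before Workday Extend
AFTER_MIN_PER_TICKET = 2     # minutes per ticket after the rollout
BEFORE_HEADCOUNT, AFTER_HEADCOUNT = 12, 5  # contractors on the team

def throughput_gain(before_minutes: float, after_minutes: float) -> float:
    """How many tickets can now be closed in the time one used to take."""
    return before_minutes / after_minutes

gain = throughput_gain(BEFORE_MIN_PER_TICKET, AFTER_MIN_PER_TICKET)
headcount_saved = BEFORE_HEADCOUNT - AFTER_HEADCOUNT
```

The numbers work out to a 22.5x throughput gain per ticket and seven fewer contractors, consistent with the roughly one million dollars in annual savings the company reported.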

Advice for Workday Users

With the experience of Cushman & Wakefield’s journey with Workday Extend, Ed Duda offered the following valuable lessons for other organizations considering similar solutions:

  1. Focus on Small Wins: Ed advises against overhauling everything at once and instead recommends focusing on specific areas that deliver immediate value.
  2. Leverage Expert Partnerships: For organizations new to Workday Extend, the best practice is to collaborate with experienced partners to navigate the initial complexities effectively.
  3. Develop Internal Expertise: To build long-term capabilities, organizations should invest in developing internal skills to manage ongoing projects independently.
  4. Embrace Continuous Learning: Workday resources allow users to gain hands-on experience to maximize the platform’s potential. This is a valuable way to build an internal Workday development practice within an organization.

Conclusion

Cushman & Wakefield’s experience demonstrates that customized solutions can deliver value to complex organizations. By addressing unique operational challenges, improving efficiency, and encouraging collaboration, Cushman & Wakefield gained an immediate return on investment. The Workday Extend project contributed to savings while setting the stage for sustained progress and compliance. Cushman & Wakefield’s focus on targeted improvements, leveraging partnerships, and building internal capabilities is an excellent example of leveraging Workday Extend.

AWS Healthcare and Life Sciences Summit: Transforming Patient Care Through Data-Driven Insights and Responsible AI https://robustcloud.com/aws-hcls-summit/ Fri, 14 Jun 2024 16:29:11 +0000

The post AWS Healthcare and Life Sciences Summit: Transforming Patient Care Through Data-Driven Insights and Responsible AI appeared first on Robustcloud.


Introduction: AWS recently hosted its first summit focused on the healthcare and life sciences (HCLS) industry, inviting a select group of industry analysts. The summit highlighted the importance of understanding the unique value delivered by AWS to address the specific needs of the healthcare and life sciences industry. AWS positioned itself as a transformative force in the HCLS industry by showcasing how customers can apply AWS services to this domain. High-quality, accurate, relevant, and timely data is critical for innovation in the AI age. The company’s central theme was enabling innovation through data-driven insights (see Figure 1 below). Connected to AI solutions, the other two themes were advancing generative AI for improved patient care and ensuring responsible AI deployment. This blog post delves into these themes, shedding light on AWS’s industry initiatives and their potential impact on the healthcare and life sciences sectors.

Figure 1: AWS Data Foundation for AI in Clinical Workflow (Source: AWS)

Enabling Innovation Through Data-Driven Insights

AWS’s healthcare and life sciences strategy is rooted in the belief that data is the cornerstone of innovation. One set of AWS efforts focuses on unifying and harnessing vast amounts of healthcare data to drive improved patient outcomes and operational efficiencies. Below are some ways of delivering data-driven benefits:

1. Unified Data Platforms:

Every organization has various data available from internal and external sources in varying formats. AWS offers solutions for breaking down data silos, allowing seamless integration and analysis of diverse data sources. For example, AWS HealthLake aggregates structured and unstructured health data, empowering healthcare providers to gain actionable insights. By converting data into a unified, searchable format, AWS enables healthcare organizations to harness the potential of their data, fostering innovation in patient care and research.
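AWS HealthLake stores health data in the FHIR R4 format, so the “unified, searchable format” above is concretely a collection of JSON resources. The sketch below shows the shape of a minimal FHIR Patient resource; every value is fictional and chosen purely for illustration.

```python
import json

# A minimal FHIR R4 Patient resource; all values are fictional.
patient = {
    "resourceType": "Patient",
    "id": "example-001",
    "name": [{"family": "Doe", "given": ["Jane"]}],
    "gender": "female",
    "birthDate": "1980-04-12",
}

# A data store like HealthLake ingests and indexes resources of this
# shape as JSON documents, making them queryable alongside data
# extracted from unstructured notes.
payload = json.dumps(patient)
```

Because every system emits the same resource types with the same field names, records from different hospitals or devices can be queried together without per-source mapping logic.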

2. Enhanced Patient and Provider Experiences:

Patient experience in healthcare settings is often based on a standard protocol that does not account for differences between patients. Physicians could deliver much better care with insight into all the available data in near real-time. Within a clinical workflow, AWS has the data management elements to support the delivery of actionable insights to the provider while improving the healthcare experience for patients. This infrastructure meets the operational needs of healthcare systems while focusing on individual patients. Unified data facilitates improved outcomes, lowers costs, and enables innovative therapeutic solutions.

For this use case, one success story shared at the summit was the partnership between Philips and AWS. Both companies expanded their collaboration to enhance digital pathology through cloud technology. This partnership combines Philips’ expertise in clinical workflows with AWS’s cloud solutions to improve diagnostic capabilities and productivity. With the security and scalability of cloud technology, pathology labs can efficiently manage and analyze large volumes of digital pathology data, integrate AI for better insights, and facilitate seamless collaboration among specialists. The goal is to improve patient care by optimizing workflow efficiency and enabling more precise and timely diagnoses.

3. Research and Development in Life Sciences:

The volume of data generated by past research makes it increasingly challenging to discover new drugs that address specific health issues. Purpose-built silicon accelerates data analysis, helping scientists process large amounts of data and gain better insight while developing new drugs. Generative AI speeds up research and discovery by helping to optimize clinical trial protocols, predict patient outcomes, and design proteins, enhancing the efficiency and effectiveness of research initiatives. AWS’s infrastructure, including custom silicon, supports continuous experimentation and integration of third-party generative AI solutions, fostering a culture of innovation.

AWS also offers data capabilities in life sciences, covering genomics, clinical trials, and drug discovery. Omics data encompasses comprehensive datasets from various biological levels, such as genomes, proteomes, and metabolomes, providing insights into genetic, protein, and metabolic functions within biological systems. Services like AWS HealthOmics transform genomic and other omics data into valuable insights, speeding up research and development. This is especially important in precision medicine, where data-driven approaches can result in more effective and targeted treatments.

Advancing Generative AI for Enhanced Patient Care

Generative AI significantly advances healthcare innovation, and AWS strives to lead this transformation. The company’s generative AI stack, which combines Amazon Bedrock with a diverse set of foundation models, offers unique capabilities for the HCLS industry.

Here are some critical characteristics of AWS’s Gen AI capabilities:

1. Diverse AI Models and Customization:

With the advent of Gen AI, many models have emerged, each with its strengths and weaknesses. This variety allows customers to select models that align with their business needs. AWS offers diverse AI models through Amazon Bedrock, including Anthropic’s Claude, Meta’s Llama, and Amazon’s Titan, all of which can be customized using organizational data. This adaptability empowers healthcare providers to choose and customize models most effective for clinical decision support, patient engagement, or operational tasks, enhancing patient care.

2. Clinical and Operational Applications:

Currently, Gen AI is mainly used to automate repetitive tasks. The healthcare industry is heavily regulated, and healthcare workers face numerous repetitive tasks required for compliance, so Gen AI applications have a wide range of uses in healthcare. For example, AWS HealthScribe can automatically generate clinical notes from patient-clinician conversations, significantly reducing the administrative burden on healthcare professionals.

Ensuring Responsible AI Deployment

As AI becomes increasingly integral to healthcare and life sciences, AWS emphasizes the importance of responsible AI deployment. This commitment is reflected in the company’s framework for ensuring ethical and secure AI usage. AWS shared the following aspects of responsible AI deployment:

1. Ethical AI Practices:

Recent lapses in AI deployments that caused racist and derogatory chatbot responses have brought ethical issues to the top of customer priorities. AWS prioritizes the ethical use of AI by incorporating mechanisms to monitor and steer AI system behavior. The approach includes evaluating and mitigating potential biases, ensuring the accuracy and fairness of AI outputs, and protecting sensitive data. For example, Amazon Bedrock includes guardrails to filter harmful content and block undesirable topics, ensuring partner or customer AI applications adhere to ethical standards.

2. Compliance and Security:

Some AI companies are facing lawsuits for using data owned by other entities without due compensation. This risk has made customers cautious about their data becoming publicly available when using publicly available AI solutions. AWS offers solutions for regulatory requirements such as HIPAA, GDPR, and SOC to address the confidentiality of healthcare data. The company’s focus on security and privacy ensures that healthcare organizations can trust AWS to protect their data with industry-leading safeguards. AWS’s standards compliance underscores its commitment to providing safe, trustworthy AI solutions.

3. Customer Empowerment:

Public AI models shot up in popularity with consumer-facing solutions but lacked transparency about the data used for training, and deployments have hit many hiccups. One example is the Air Canada customer chatbot, which provided incorrect guidance to customers. As a result, customers embarking on AI solution deployment expect visibility into the end-to-end process. AWS enables customers to make informed decisions about their engagement with AI systems. This visibility involves transparency around AI model evaluations and ensuring organizations can customize models to fit their unique requirements. By doing so, AWS fosters a collaborative environment where stakeholders can actively participate in developing and deploying AI solutions.

Conclusion

The AWS Healthcare and Life Sciences Summit underscored the company’s role in driving innovation with industry partners and customers through its comprehensive cloud solutions. By focusing on data-driven insights, AWS aims to break down data silos and enable healthcare organizations to harness the full potential of their data, leading to improved patient outcomes and operational efficiencies. The advancements in generative AI showcased by AWS, including diverse AI models and applications in clinical settings, highlight the transformative potential of AI in enhancing patient care. Moreover, AWS’s commitment to responsible AI deployment ensures that ethical considerations and robust security measures are integral to its solutions, fostering customer trust and reliability.

As the healthcare and life sciences sectors evolve, AWS’s strategic initiatives and innovative technologies position it as a critical enabler of progress. Industry-appropriate messaging by AWS will improve HCLS customers’ ability to map technology value to business outcomes. The summit highlighted AWS’s current capabilities and provided a glimpse into the future of healthcare, where data and AI play central roles in driving meaningful change.

The post AWS Healthcare and Life Sciences Summit: Transforming Patient Care Through Data-Driven Insights and Responsible AI appeared first on Robustcloud.

]]>
Oracle Unveils OCI Strategy: Rapid Data Center Growth, Advanced AI Infrastructure, and Versatile Cloud Solutions https://robustcloud.com/oracle-unveils-oci-strategy/ Thu, 06 Jun 2024 21:35:03 +0000 https://robustcloud.com/?p=4098 Introduction: At the Seattle summit in May 2024, Oracle’s senior executives briefed a select group of industry analysts on OCI’s strategy. The presentation covered key areas such as infrastructure, sales differentiators, customer success, and a customer panel discussion. The focus was on OCI’s unique high security and privacy selling points combined with a distributed cloud […]

The post Oracle Unveils OCI Strategy: Rapid Data Center Growth, Advanced AI Infrastructure, and Versatile Cloud Solutions appeared first on Robustcloud.

]]>

Introduction: At the Seattle summit in May 2024, Oracle’s senior executives briefed a select group of industry analysts on OCI’s strategy. The presentation covered key areas such as infrastructure, sales differentiators, and customer success, and included a customer panel discussion. The focus was on OCI’s unique high-security and privacy selling points combined with a distributed cloud strategy. OCI services are widely used in various deployments, from public and sovereign cloud to cloud at customer, dedicated regions, and Oracle Alloy. While the OCI strategy caters to diverse customer needs, this report emphasizes the most significant focus areas.

Figure 1: OCI distributed cloud offerings (Source: Oracle)

Building data centers at speed: Clay Magouyrk, EVP of OCI, shared OCI’s growth across revenue and customers, which requires additional data centers. Mahesh Thiagarajan, EVP of OCI Core Infrastructure, spoke about Oracle’s strategic and rapid data center construction. The accelerated capability comes from process improvements in the tasks required to design and deploy a data center into production, from generating data hall diagrams to validating GPU clusters. With vendors playing a significant role in any data center construction effort, supply chain optimization for faster delivery is essential. Oracle stressed the importance of building more inventory with new strategic reserves for future expansion plans, instilling confidence in the audience about OCI’s growth strategy.

While data center construction may seem straightforward, optimizing the process is crucial for an organization preparing for growth. Oracle highlighted the increasing power demand, expected to triple from 2024 to 2029. The company’s strategy of planning locations based on power availability is a critical factor in its ability to expand among leading public cloud providers.

Delivering AI capabilities:

The demand for AI solutions is growing, and hyperscale cloud vendors are racing to grow infrastructure to meet customer requirements. As a result, capital spending by cloud providers is increasing as demand is outpacing the available AI infrastructure capacity. Oracle shared the core AI infrastructure differentiators of OCI as:

  1. Speed: Delivering large GPU deployments on time.
  2. Performance: Scaling for large-scale Gen AI training with sustained model floating-point utilization.
  3. Health: Enabling customers to complete multi-week or multi-month training runs at success rates above 95%.
  4. Price: Lower cost per port.
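
The performance point above refers to how efficiently training sustains hardware FLOPs, commonly quantified as model FLOPs utilization (MFU). As a rough illustration — all figures below are invented for the example, not Oracle’s — MFU can be estimated with the widely used ~6 × parameters × tokens approximation for transformer training FLOPs:

```python
def mfu(params: float, tokens_per_sec: float, peak_flops_per_sec: float) -> float:
    """Fraction of peak hardware FLOPs actually spent on the model's math.

    Uses the ~6 * N * D approximation for transformer training FLOPs,
    where N is parameter count and D is tokens processed.
    """
    achieved = 6 * params * tokens_per_sec  # approx. training FLOPs per second
    return achieved / peak_flops_per_sec

# Hypothetical example: a 70B-parameter model processing 350k tokens/s on a
# cluster with an aggregate peak of 4e17 FLOP/s.
utilization = mfu(70e9, 3.5e5, 4e17)
print(f"{utilization:.0%}")  # roughly a third of peak, in this made-up scenario
```

Small improvements in this ratio compound over multi-week training runs, which is why vendors compete on it.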

One of the innovations shared was bare metal compute, which enables isolation and security because it runs only a specific customer’s workload. This capability is vital to AI customers who pay close attention to safeguarding individual data. Another capability was defining the size and shape of GPU memory, supporting everything from small language models to large language models with 100B+ parameters.
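
The link between parameter count and GPU memory shape can be made concrete with a back-of-envelope calculation. This is a hedged sketch: it counts only the weights themselves, while real deployments also need memory for activations, optimizer state, and KV caches.

```python
def weight_memory_gb(params: float, bytes_per_param: int = 2) -> float:
    """GB of memory needed just to hold model weights.

    Ignores activations, optimizer state, and KV caches, which can
    multiply the real requirement several times over during training.
    """
    return params * bytes_per_param / 1e9

# A 100B-parameter model in fp16/bf16 (2 bytes per parameter)
print(weight_memory_gb(100e9))      # 200 GB of weights alone
# The same model quantized to 8-bit (1 byte per parameter)
print(weight_memory_gb(100e9, 1))   # 100 GB
```

At these sizes, a single 100B+ parameter model spans multiple GPUs, which is why configurable memory size and shape matters.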

Oracle’s AI infrastructure is versatile for customers. It can scale and support compute powered by Intel and NVIDIA, with AMD support planned in the future.

Choices for OCI Services:

Users have varied needs when using cloud services. Data privacy and security concerns drive cloud vendors to ensure data is stored, processed, and managed within a specific country’s borders, leading to the need for sovereign clouds. Some customers want their own data center but with the services available on public clouds. Independent partners need to provide their customers with cloud services without the high investment otherwise required.

During the summit, Oracle shared how OCI meets customers’ needs (Figure 1 above).

  1. Oracle EU Sovereign Cloud provides cloud services with complete operational control to meet data residency and compliance requirements. The infrastructure offers flexibility through various deployment models, ensuring organizations maintain sovereignty while accessing Oracle’s cloud services, APIs, and service-level agreements (SLAs).
  2. OCI Dedicated Region offers an OCI cloud region in the customer’s data center. It provides the benefits of the OCI public cloud while giving customers control over data and applications to meet security, regulatory, and data residency requirements. OCI public cloud services enable the modernization of infrastructure and applications with a consistent platform across both public cloud and on-premises environments.
  3. Oracle Alloy is a cloud infrastructure platform that enables partners to become cloud providers, offering Oracle cloud services and customized cloud services built by the partner. This platform allows partners to customize their commercial and customer experiences, control operations, and create new services tailored to specific market needs, all while maintaining stringent security and compliance standards without significant investments.

Oracle is the only cloud vendor that provides such a set of customer options. In the long run, this strategy could help Oracle in the following ways:

  1. AI where customers need it: While customers would like to reap the benefits of AI, one issue is training it on their data. The Oracle distributed cloud options address data location and operational control of AI workloads.
  2. Feedback for improvement: Lessons learned from customer and partner experience can help Oracle prioritize improvements to OCI services.
  3. Revenue growth: Partners using OCI to develop and deliver white-label end-to-end cloud solutions to customers can bring new revenue sources to Oracle.

Conclusion: Oracle’s comprehensive OCI strategy offers a range of deployment options tailored to meet diverse customer needs, emphasizing security, privacy, and flexibility. The Seattle summit showcased OCI’s strengths, including rapid data center construction, innovative AI infrastructure, and versatile service offerings such as Oracle EU Sovereign Cloud, OCI Dedicated Region, and Oracle Alloy. These capabilities enable customers to maintain data sovereignty, ensure compliance, and leverage advanced AI solutions. Oracle’s unique approach addresses current market demands and positions the company for sustained growth by fostering customer feedback, enhancing service quality, and generating new revenue streams through strategic partnerships.

Oracle’s multi-cloud strategy differs from that of major cloud providers. Recent wins at Fujitsu and Nomura Research Institute in Japan show the worldwide partner momentum. General Dynamics Electric Boat and Aramco have implemented dedicated regions. Palantir is delivering AI services to the government using Oracle’s AI infrastructure. If Oracle’s distributed cloud strategy increases its market share, competitors will likely offer similar capabilities.

The post Oracle Unveils OCI Strategy: Rapid Data Center Growth, Advanced AI Infrastructure, and Versatile Cloud Solutions appeared first on Robustcloud.

]]>
Make Your Cameras Work Harder With Intelligent Video Analytics and Cloud Synergy https://robustcloud.com/intelligent-video-analytics/ Thu, 18 Apr 2024 12:31:47 +0000 https://robustcloud.com/?p=3936 Introduction: Video surveillance and monitoring is about more than capturing footage; it’s about extracting valuable insights from that data. The result is real-time actions to respond to anomalies detected from insights proactively. The computer vision industry is seeing a trend where intelligent video processing is moving to the edge of networks, all while leveraging the […]

The post Make Your Cameras Work Harder With Intelligent Video Analytics and Cloud Synergy appeared first on Robustcloud.

]]>

Introduction: Video surveillance and monitoring is about more than capturing footage; it’s about extracting valuable insights from that data. The result is real-time actions to respond to anomalies detected from insights proactively. The computer vision industry is seeing a trend where intelligent video processing is moving to the edge of networks, all while leveraging the scalability and accessibility of the public cloud. Leveraging videos on the edge is becoming a winning formula for enterprises trying to gain a competitive edge through automation. The use of cameras is widespread in consumer-facing solutions like home security as well as in purpose-built applications like self-driving vehicles. This write-up explores applying computer vision to video footage in enterprise use cases.

Figure 1: Customer Use Cases. Source: Plainsight Technologies

Examples of video footage in enterprise use cases:

·       Modern manufacturing assembly lines rely on cameras for quality control and efficiency improvements through various applications. Cameras track individual process times to pinpoint bottlenecks and optimize areas. They also monitor hazardous areas to detect safety violations, such as workers not wearing protective gear or entering restricted zones. The integration of cameras in manufacturing assembly lines enhances automation, safety, and quality assurance, reflecting innovative manufacturing practices.

·       Video technology has become surprisingly versatile within the agricultural sector. By increasing the use of video footage, agricultural efficiency is enhanced, benefiting livestock management and crop yield. Cameras placed in barns and pens can monitor livestock in real time. Farmers can remotely check animal health, identify potential birthing problems, and even deter theft or intrusions. Drones and other aerial vehicles equipped with cameras can capture high-resolution images and videos of vast fields. This footage allows farmers to detect areas of stress, pest infestations, or nutrient deficiencies much earlier than they would be able to on foot, enabling them to take quick and targeted actions to address the issues. The integration of video technology into agricultural operations represents a significant shift towards data-driven farming, enhancing productivity, sustainability, and resource management.

·       Innovative retailers use video to enhance customer experiences, streamline operations, and gain valuable business insights. Retailers can gain valuable insights into customer behavior and preferences by analyzing foot traffic patterns, dwell times, and product interactions. This information can be utilized to optimize store layouts, product placements, and staffing levels, resulting in improved customer satisfaction and increased sales. By generating heat maps from video data, retailers can identify store areas where customers spend the most time. This enables them to strategically place high-margin products in these locations or adjust the layout to distribute foot traffic more evenly. The retail industry is poised to leverage video footage further, exploiting new opportunities for innovation and growth.

·       Traffic cameras provide real-time video footage for authorities to monitor road conditions, respond to incidents, and analyze traffic patterns. This footage is primarily used to monitor and manage traffic in real time. This helps traffic operators make informed decisions to reduce congestion and optimize traffic flow, especially during peak travel or unexpected events. The application of video footage analysis enables real-time traffic monitoring, improves law enforcement for road safety, and assists accident investigations.
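
The retail heat-map idea above can be sketched in a few lines. This is a toy illustration — not any vendor’s API — that bins tracked customer positions into a grid, where hot cells reveal where shoppers dwell:

```python
def heat_map(positions, width, height, cell=1):
    """Accumulate (x, y) position observations into a coarse grid.

    positions: iterable of (x, y) store coordinates from video tracking.
    Returns a 2D list where each cell counts observations in that area.
    """
    grid = [[0] * (width // cell) for _ in range(height // cell)]
    for x, y in positions:
        grid[y // cell][x // cell] += 1
    return grid

# Hypothetical tracked positions in a tiny 4x4 store grid
obs = [(0, 0), (1, 0), (1, 0), (3, 2)]
hm = heat_map(obs, width=4, height=4)
# hm[0][1] == 2: two observations in that cell -- a candidate spot
# for high-margin product placement.
```

Production systems would feed this from an object tracker and weight by dwell time, but the accumulation step is essentially this.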

The Benefits of Video Processing on Edge Devices

·       Latency: Real-time video analysis is essential for some applications, and edge devices provide immediate responses that require lower latency. Such applications include traffic monitoring, intrusion detection, and industrial safety.

·       Privacy: Wide-ranging regulations cover the privacy requirements of data collected from videos, including purpose, consent, and notification. Edge solutions provide encryption and access control while limiting data retention to meet these regulations. Sensitive footage can be analyzed locally; only anonymized results (e.g., object counts and motion patterns) are transmitted.

·       Lower bandwidth costs: Balancing low latency with high bandwidth costs is challenging when processing videos on the cloud. Only crucial insights and metadata are transmitted instead of continuously streaming raw video to the cloud.

·       Resilience: Edge analytics can operate without a constant cloud connection, ensuring continued functionality in areas with intermittent connectivity.
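
The privacy and bandwidth points above share one pattern: analyze frames locally and transmit only small, anonymized aggregates. A minimal, hypothetical sketch of that reduction step:

```python
from collections import Counter

def summarize_detections(detections):
    """Reduce per-frame detections to privacy-preserving metadata.

    detections: list of (label, bounding_box) pairs from a local model.
    Returns only aggregate counts -- no imagery or identities leave the device.
    """
    counts = Counter(label for label, _bbox in detections)
    return {"object_counts": dict(counts), "num_detections": len(detections)}

# Hypothetical detections for one frame at an industrial site
frame_detections = [("person", (10, 20, 50, 80)),
                    ("person", (60, 22, 95, 81)),
                    ("forklift", (120, 40, 200, 110))]
payload = summarize_detections(frame_detections)
print(payload)  # only this small dict is sent upstream, not the raw frame
```

A few hundred bytes of metadata per frame replaces megabytes of raw video, which is where both the bandwidth savings and the privacy posture come from.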

The Value of Public Cloud

·       Scale: Cloud computing enables businesses to scale storage and compute resources quickly and efficiently. This flexibility enhances performance and accessibility, improving operational efficiency and supporting business growth and innovation.

·       Analytics: Leveraging cloud computing resources for analytics offers many benefits like scalability and elasticity that can significantly enhance an organization’s data analysis capabilities. In addition, public cloud platforms offer advanced analytics tools and algorithms that can be accessed as services, enabling organizations to apply sophisticated data analysis techniques without needing in-house expertise.

·       Remote access: Public cloud facilitates collaboration by providing centralized data storage and access, allowing teams to work together effectively, regardless of their physical location.

The Plainsight and SoftIron Partnership

Video surveillance has evolved from simple capture and storage to proactive, insight-driven action. Intelligent video processing has moved to the network edge to achieve this transformation while leveraging scalability and public cloud access. The combination of edge and cloud is crucial for businesses looking to unlock the full potential of video data for efficiency, security, and innovation. Plainsight and SoftIron have partnered to provide edge processing and flexible cloud connectivity. Some benefits customers gain from the two companies are as follows:

1. Real-Time Action from Edge Insights: Plainsight allows businesses to make quick decisions based on visual insights. Their AI-powered ‘Filters’ can be deployed on any platform that supports containerized workloads. These filters can be customized and act as digital eyes that analyze live camera feeds. SoftIron’s HyperCloud is a suitable edge solution for this purpose. It is a private cloud that provides fast local processing crucial for real-time decision-making and automated responses.

2. Data Sovereignty: Privacy and Control: The partnership between Plainsight and SoftIron emphasizes security and data ownership. Plainsight Filters are containerized to keep customer data on-premises and protected using offline training. SoftIron’s HyperCloud enables this within a secure, private cloud, making it applicable for sensitive applications.

3. The Power of Scalability: This solution is designed to expand with growing needs. SoftIron HyperCloud’s modular architecture and Plainsight’s Kubernetes-based Filters allow easy scaling. Plainsight Filters, with their cloud-native design, can be deployed across SoftIron’s HyperCloud nodes in a private package optimized for the network edge to analyze additional video feeds.

4. Edge Deployment Made Easy: Plainsight and SoftIron’s IT-friendly solutions allow small teams to deploy and manage the video intelligence pipeline without deep machine learning or cloud infrastructure knowledge.

5. When the Physical and Digital Converge: Plainsight Filters are developed to imitate human vision and decision-making, providing a digital representation of trained human insight at the physical locations where it’s required. This capability allows businesses to install these Filters in any environment where a camera is connected. The flexible architecture of Plainsight’s FilterBox, which is not limited to specific hardware, combined with SoftIron’s HyperCloud’s scalable infrastructure, facilitates the deployment of vision AI in various settings, from factory floors to remote outdoor locations.

Challenges:
 
1. Combination with Existing Infrastructure: Integrating new technology like edge processing and analytics with legacy camera systems or network infrastructure can be complex. Compatibility issues, network configuration changes, and balancing old with new requirements can create unexpected hurdles during deployment.

2. Data governance: Organizations should define clear data governance policies around data retention, access controls, and compliance with privacy regulations, as policies may vary depending on the industry and region.

3. Storage: It is crucial to balance local processing needs with cloud storage for edge devices. This requires thoughtful architecture and capacity planning.

Summary:

Overall, integrating video analytics, edge computing, and cloud capabilities offers enterprises a powerful tool to improve efficiency, security, and innovation across various industries. Automation and insight from video footage can only be partially achieved with edge or cloud solutions alone; a hybrid approach is ideally suited to the task. Potential hurdles include integrating legacy systems, maintaining data governance, and balancing edge and cloud storage.

The collaboration between Plainsight and SoftIron exemplifies the synergy between edge and cloud computing, offering businesses a scalable, secure, and efficient solution for video data processing. Plainsight’s AI-powered Filters and SoftIron’s HyperCloud enable real-time, actionable insights while ensuring data sovereignty and easing deployment challenges. This partnership addresses the complexities of integrating cutting-edge technology with existing infrastructures, navigating data governance, and balancing processing needs with storage requirements. Through this blend of edge and cloud computing, enterprises are equipped to unlock the full potential of video data, paving the way for a future where digital insights drive physical world actions, fostering efficiency, security, and innovation across various sectors.

About Plainsight (Source: Plainsight)
Plainsight Technologies is the comprehensive vision AI factory. Plainsight Vision Intelligence Filters let companies of all sizes see more business as they measure and automate quality control, production yield, compliance, inventory, and other critical business operations using their existing camera feeds. Filters are containerized computer vision applications composed to solve specific business problems that run on FilterBox, our proprietary Runtime Engine. Learn more at plainsight.ai

About SoftIron: (Source: SoftIron)
SoftIron is a Silicon Valley-based worldwide leader in true private cloud infrastructure. HyperCloud by SoftIron allows organizations to build a true private cloud on-premises that deploys, manages and consumes like public cloud. HyperCloud provides the elasticity of cloud in a solution that is fast and simple to deploy, driving extreme agility. HyperCloud delivers the benefits of AWS Outposts or Azure Stack HCI but in a cloud-neutral solution. Learn more at www.softiron.com

The post Make Your Cameras Work Harder With Intelligent Video Analytics and Cloud Synergy appeared first on Robustcloud.

]]>
KubeCon EU 2024 Report: The Rise of Kubernetes, Licensing Wars, and AI https://robustcloud.com/kubecon-eu-2024/ Wed, 10 Apr 2024 17:47:20 +0000 https://robustcloud.com/?p=3925 Introduction: KubeCon EU 2024 was held in Paris from March 19th to the 22nd, with a record number of more than 12,000 attendees. After its formation in 2015, the Cloud Native Computing Foundation (CNCF) has 183 graduates, incubating and sandboxing projects, and 233K contributors. AI was front and center throughout the conference, with the Cloud […]

The post KubeCon EU 2024 Report: The Rise of Kubernetes, Licensing Wars, and AI appeared first on Robustcloud.

]]>
Introduction: KubeCon EU 2024 was held in Paris from March 19th to the 22nd, with a record number of more than 12,000 attendees. Since its formation in 2015, the Cloud Native Computing Foundation (CNCF) has grown to 183 graduated, incubating, and sandbox projects, and 233K contributors. AI was front and center throughout the conference, with Cloud Native AI Day as a CNCF-hosted co-located event and an AI Hub Unconference. This is a short write-up of some observations at the event that point to continued innovation in the open-source community.
Figure 1: Crowds at KubeCon EU 2024. Image Courtesy Linux Foundation

The Move to Closed Source: RobustCloud had previously written about HashiCorp’s unexpected shift of Terraform’s license to the Business Source License. Under the auspices of the Linux Foundation, a fork named OpenTofu resulted, and the CNCF-hosted OpenTofu co-located event was packed. At the same time, HashiCorp lost its co-founder and is reportedly looking for a buyer, leading to customer uncertainty.
During the conference, on March 20th, Redis Labs announced it is adopting a dual licensing model with RSALv2 and SSPLv1. This model allows continued free use of the source code for most users while requiring cloud service providers to obtain a commercial license. While Microsoft committed to a license from Redis Labs, the reaction from the Linux Foundation was swift in forming Valkey, an open-source alternative to the Redis in-memory NoSQL data store. With support from major cloud service providers, including Amazon Web Services (AWS), Google Cloud, and Oracle, it remains to be seen whether Valkey will win the battle for future innovation.

The Cost of AI: Users of AI, from enterprises to startups, must keep a close eye on costs to ensure that the profitability of delivering AI services holds up the business model. During the conference, it was evident that many AI workloads run on Kubernetes. Karpenter, an open-source Kubernetes autoscaler, monitors workloads and adjusts cluster sizes appropriately. Since its announcement in 2021, Karpenter’s use has expanded into AI, with customers like Anthropic demonstrating a 40% reduction in AWS spending using the add-on. Another AWS customer, Grafana Labs, showed how they run services on every hyperscaler but migrated from the Cluster Autoscaler to Karpenter due to its ability to select the optimal nodes for a workload to run on.

With the success of Karpenter, other cloud providers will likely develop and deliver similar tools. As companies evaluate retiring on-premises workloads and moving them to the cloud using Kubernetes, Karpenter can assist with better resource utilization and reduced costs.
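
The core economics behind an autoscaler like Karpenter can be illustrated with a toy calculation. This is not Karpenter’s actual algorithm or API — it handles consolidation, spot capacity, and much more — and the node types and prices below are invented; the point is simply that matching node size to demand, rather than defaulting to one instance type, changes the bill:

```python
import math

# Hypothetical node types: name -> (vCPU capacity, hourly price)
NODE_TYPES = {
    "small": (4, 0.20),
    "medium": (16, 0.68),
    "large": (64, 2.60),
}

def cheapest_fit(vcpus_needed):
    """Return the node type (and hourly cost) that serves the demand cheapest."""
    best_name, best_cost = None, float("inf")
    for name, (capacity, hourly_price) in NODE_TYPES.items():
        cost = math.ceil(vcpus_needed / capacity) * hourly_price
        if cost < best_cost:
            best_name, best_cost = name, cost
    return best_name, best_cost

# Small demand wastes money on big nodes; big demand wastes it on rounding
# up many small ones -- so the cheapest choice shifts with the workload.
name, cost = cheapest_fit(10)   # 'small' wins: 3 nodes cover 10 vCPUs
name, cost = cheapest_fit(60)   # 'large' wins: one node beats 15 small ones
```

Repricing decisions like this continuously, as pods come and go, is what turns into the double-digit savings customers reported.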

Kubernetes Since 2014: At the last day’s keynote, Bob Wise, CEO of Heroku, reminisced about the beginnings of cloud native, from the open-sourcing of Docker and Kubernetes to the launch of the Cloud Native Computing Foundation (CNCF) in July 2015. Bob also gave a shout-out to Dan Kohn, who shepherded the Open Container Initiative to develop open standards for container formats and runtimes.

One service Bob mentioned was the Amazon Elastic Kubernetes Service (Amazon EKS) launched in 2018, which he credited to CNCF’s open governance model. An interesting observation is that although Amazon Web Services (AWS) was the last among hyperscalers to offer a managed Kubernetes service in 2018, AWS had more global container workloads than any cloud provider in 2021 (Source: CNCF Annual Survey 2021). The success of Kubernetes on AWS reflects the desire of app developers to build and deploy modern application architectures on an abstracted infrastructure.

Later in the keynote, Solomon Hykes, the CEO of Dagger.io, spoke about the future of application delivery in a containerized world. He covered the launch of Docker in 2013, which initiated the container revolution. The journey passed through Platform-as-a-Service (PaaS) offerings and has arrived at hyperscalers dominating the container market with comprehensive collections of services. Before embarking on an application development journey, developers now choose the best services to solve specific business problems. As a result, infrastructure as a service is less relevant, and cloud providers are selected based on the variety of services available to the developer.

Conclusion: KubeCon EU 2024 highlighted significant shifts in the cloud-native landscape, including licensing changes and using Kubernetes as the primary delivery mechanism for AI workloads. The trend toward closed-source licensing models raises questions about the future of open-source collaboration, motivating the formation of projects like OpenTofu and Valkey. The collective response to closed-source transitions reveals a strong community inclination towards open-source solutions, underscoring the resilience and adaptability of the cloud-native landscape.  At the same time, the conference highlighted the crucial role of cost management in AI deployments, with open-source tools like Karpenter emerging as essential for efficiency. As Kubernetes continues its rapid growth across cloud providers, its success mirrors the broader shift towards abstracted infrastructure and developer-centric solutions. Looking ahead, the cloud-native ecosystem promises continued innovation, with the challenge for developers lying in choosing the best combination of services to address their unique needs.

The post KubeCon EU 2024 Report: The Rise of Kubernetes, Licensing Wars, and AI appeared first on Robustcloud.

]]>
The End of Human-Centric Decision-Making: Gen AI is Changing the Game by Transforming Recommendations https://robustcloud.com/the-end-of-human-centric-decision-making/ Mon, 04 Mar 2024 23:04:28 +0000 https://robustcloud.com/?p=3916 Introduction: In the era of COBOL, business rules and decisions were intricately woven into the fabric of the program itself. As COBOL programming methodologies advanced, subroutines and copybooks became popular tools for organizing and reusing code. However, even minor changes required the program to be recompiled, a time-consuming and resource-intensive process. The landscape stayed relatively […]

The post The End of Human-Centric Decision-Making: Gen AI is Changing the Game by Transforming Recommendations appeared first on Robustcloud.

]]>

Introduction: In the era of COBOL, business rules and decisions were intricately woven into the fabric of the program itself. As COBOL programming methodologies advanced, subroutines and copybooks became popular tools for organizing and reusing code. However, even minor changes required the program to be recompiled, a time-consuming and resource-intensive process.

The landscape stayed relatively the same with the rise of client-server technologies, with C, C++, or Java becoming the preferred programming languages. While these languages offered improvements in terms of efficiency and versatility, the need for human intervention in the coding process remained constant.

A significant paradigm shift occurred with the inception of rules engines. These innovative tools enabled the externalization of rules and decision trees, allowing for dynamic invocation and greatly enhancing the flexibility of programming. Yet, despite this advancement, the responsibility of deliberating, deciding, and coding the necessary rules still fell on the shoulders of human domain experts and programmers.
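
The step from hard-coded logic to externalized rules is easy to see in miniature: when rules are plain data, they can be swapped at runtime with no recompilation. A minimal sketch, with hypothetical rule conditions and order fields:

```javascript
// Rules live in data, external to the program: each has a condition
// and an action. Changing them requires no rebuild of the application.
const rules = [
  { if: (o) => o.total > 100, then: (o) => ({ ...o, discount: 0.10 }) },
  { if: (o) => o.loyaltyYears >= 5, then: (o) => ({ ...o, freeShipping: true }) },
];

// A tiny rules engine: apply every matching rule to the input in order.
function applyRules(order, ruleSet) {
  return ruleSet.reduce((acc, r) => (r.if(acc) ? r.then(acc) : acc), order);
}

const result = applyRules({ total: 120, loyaltyYears: 6 }, rules);
console.log(result); // { total: 120, loyaltyYears: 6, discount: 0.1, freeShipping: true }
```

Even in this toy form, the rules still had to be deliberated and written by a human expert; only their packaging changed.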

The game’s rules are being rewritten with the advent of Generative AI (Gen AI). Gen AI has the potential to build and modify rules dynamically as new data becomes available, marking a departure from traditional methods. This shift heralds a new era in decision-making models, transitioning from a ‘human-centric’ approach to a ‘data-centric’ one.
Figure 1: A humanoid robot representing Generative AI technology. Source: ChatGPT, DALL·E

The Impact of Gen AI on Recommendation Engines:

Integrating Gen AI into recommendation engines promises to bring about several transformative changes. These include:

Revolutionizing Offers, Deals, and Promotions: Businesses can transition from human-centric decision models to data-centric ones by leveraging Gen AI. This shift allows for more personalized and dynamic offers, deals, and promotions tailored to user preferences and behaviors.

Reducing Effort in Analysis and Reviews: With Gen AI, the need for extensive analysis and reviews of decision models can be significantly reduced. The AI can automatically learn from the data, identify patterns, and make decisions, thereby streamlining the process.

Automating Rules Generation and Execution: Gen AI can automate the process of rules generation and execution. This automation increases efficiency and reduces the potential for human error.

Automating Coding: Gen AI can automate the coding process, further reducing the need for human intervention and accelerating the development cycle.
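
The contrast with a human-centric model can be sketched very simply. Instead of a hand-written threshold, the cutoff below is derived from recent purchase data, so the "rule" shifts as new data arrives. This is a deliberately crude stand-in for what a Gen AI model would learn, and all figures are made up:

```javascript
// Data-centric sketch: rather than hardcoding "discount carts over $100",
// derive the promotion cutoff from observed cart values (hypothetical data).
const recentCartValues = [40, 55, 60, 80, 95, 110, 130, 150];

// Use the 75th percentile of observed values as the cutoff.
function percentile(values, p) {
  const sorted = [...values].sort((a, b) => a - b);
  const idx = Math.min(sorted.length - 1, Math.floor(p * sorted.length));
  return sorted[idx];
}

const threshold = percentile(recentCartValues, 0.75);
const shouldOffer = (cartValue) => cartValue >= threshold;

console.log(threshold);        // 130: derived from data, not hand-coded
console.log(shouldOffer(140)); // true
```

As the purchase data changes, the decision boundary moves with it, with no human rewriting the rule.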

Benefits: Gen AI provides businesses with the ability to enhance their recommendation engines, resulting in a range of benefits:

Hyper-personalization: Businesses can deliver tailored recommendations and offers aligned with individual interests and requirements by analyzing customer data in detail.

Real-time Adaptability: Businesses can respond to changing customer behavior and market trends in real time, ensuring that their offers and promotions remain relevant and effective.

Elevated Customer Experience: By providing personalized interactions, businesses can delight customers, demonstrate understanding of their preferences, and establish brand loyalty while increasing conversion rates.

Challenges: Gen AI in recommendation engines presents unique challenges that must be addressed for effective implementation:

Data Integrity and Bias: Gen AI models must be trained on diverse, high-quality datasets to avoid biased or inaccurate recommendations. Continual monitoring and refinement are crucial.

Explainability:  While Gen AI can automate decisions, understanding the reasoning behind recommendations is essential. Techniques to ensure explainability build trust and aid in troubleshooting.

System Integration:  Incorporating Gen AI into existing processes often requires significant changes. A smooth transition requires a thorough impact analysis of systems, finances, and organizational structure.

Hallucinations:  Gen AI models can sometimes produce incorrect or misleading outputs. Robust safeguards and quality control processes are needed to mitigate these “hallucinations.”

The example of Google Gemini’s inaccurate and offensive image generation underscores the importance of ensuring data quality and model explainability to maintain confidence and protect organizational reputation.

Conclusion: Implementing Gen AI in programming and decision-making processes is a significant advancement that can increase efficiency and improve personalization while decreasing human intervention. However, it is crucial to carefully consider the quality of data used, the ability to explain the decision-making process, and the overall impact on the organization to take advantage of its benefits and address its challenges fully.

Although Gen AI technologies are currently in their early stages, they are rapidly developing and hold immense potential for the future. These technologies can significantly reshape decision-making models and recommendation engines in the coming months. This will bring about a new era of automated, data-driven decision-making. As we continue to explore and harness the power of Gen AI, we can expect to see more efficient, accurate, and personalized recommendation engines than ever before.

The post The End of Human-Centric Decision-Making: Gen AI is Changing the Game by Transforming Recommendations appeared first on Robustcloud.

]]>
Unleash AI Insights: Salesforce Einstein 1 Now Reads Your PDFs & Emails https://robustcloud.com/unleash-ai-insights-salesforce/ Wed, 03 Jan 2024 19:02:53 +0000 https://robustcloud.com/?p=3874 In 2006, Clive Humby, the UK Mathematician and a founder of Dunnhumby coined the term “Data is the new oil,” which has been used numerous times in quotes by others since then. While Clive Humby did not mention the type of data, in the age of generative AI, unstructured data plays a crucial role in […]

The post Unleash AI Insights: Salesforce Einstein 1 Now Reads Your PDFs & Emails appeared first on Robustcloud.

]]>
In 2006, Clive Humby, the UK mathematician and a founder of Dunnhumby, coined the phrase “Data is the new oil,” which has been quoted numerous times since. While Clive Humby did not specify the type of data, in the age of generative AI, unstructured data plays a crucial role in training large language models (LLMs). Integrating Generative AI prompt responses with structured data from systems of record gives organizations an unprecedented capability for automation. Considering this critical customer requirement, Salesforce announced unstructured data support in its Einstein 1 platform at its world tour event held in NYC on December 14, 2023 (See Figure 1 below). This announcement follows the Salesforce Data Cloud news at its annual conference at the Moscone Center in San Francisco in September 2023. This blog post will cover how Salesforce leverages the overall Data Cloud portfolio across its Generative AI portfolio and its unstructured data support, followed by a summary.
Figure 1: Unstructured Data Support in the Einstein 1 Platform. Source: Salesforce

Generative AI at Salesforce

Salesforce is primarily a Software as a Service (SaaS) and Platform as a Service (PaaS) vendor. For SaaS, embedding Generative AI functionality into applications immediately delivers capabilities like conversational AI to end-users. For PaaS, the ability to build Generative AI applications trained on customer data accelerates developers' ability to create custom applications for their end-users.

The value of Generative AI comes from foundational models trained on large amounts of data. Uncontrolled data ingestion from open Internet sources can raise issues like Intellectual Property (IP) infringement and bias in results. The New York Times recently filed a lawsuit against OpenAI and Microsoft over copyright infringement. These risks lead customers to choose vendors that deliver Generative AI value with models trained on private data. Using customer data helps decision-making and automation unique to each business situation. For example, models trained on enterprise knowledge bases can accurately determine customer sentiment and recommend specific actions.

The underpinning of Salesforce’s AI strategy is the Salesforce Data Cloud, which processes massive volumes of data and connects billions of records daily to create unified customer profiles. This overcomes the challenge of islands of disconnected data by consolidating them into one source that allows AI to deliver a consistent experience. The Salesforce Data Cloud has an open ecosystem, enabling customers to bring their own data lake, such as Snowflake or Databricks, or models from fully managed AI/ML platforms, such as Vertex AI on Google Cloud and Amazon SageMaker on AWS. In addition, Salesforce Data Cloud is integrated with first-party advertising platforms like Meta and with partner apps through AppExchange.

Unstructured Data in Einstein 1 platform

At the World Tour NYC, Salesforce announced two capabilities that enhance leveraging unstructured data described below.

The Data Cloud Vector Database: The Einstein 1 Platform brings together Data, AI, CRM, Development, and Security to facilitate the development of generative apps and automation. Data Cloud Vector Database will remove the need to fine-tune LLMs by using business data to enrich AI prompts. This allows customers to use the diverse data types generally prevalent across their business. It could also increase ROI on technology investments by providing a single view of all of the enterprise’s unstructured data (PDFs, emails, documents, and transcripts) and structured data (purchase history, customer support cases, and product inventory) to power AI, automation, and analytics across applications. One example is boosting efficiency and improving customer satisfaction by presenting relevant knowledge to service agents.

These capabilities are easily applicable within the Salesforce ecosystem but can be extended to other applications using Salesforce’s integration capability through MuleSoft.
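
The core retrieval step behind any vector database can be sketched with plain cosine similarity: store an embedding per document, then fetch the closest one to enrich an AI prompt. The tiny three-dimensional vectors below are hypothetical stand-ins for real embeddings, which have hundreds or thousands of dimensions:

```javascript
// Toy vector search: each document carries a made-up embedding.
const docs = [
  { text: "Refund policy for damaged goods", vec: [0.9, 0.1, 0.0] },
  { text: "Shipping times by region",        vec: [0.1, 0.8, 0.2] },
  { text: "Product warranty terms",          vec: [0.7, 0.2, 0.3] },
];

// Cosine similarity: how closely two vectors point in the same direction.
function cosine(a, b) {
  const dot = a.reduce((s, x, i) => s + x * b[i], 0);
  const norm = (v) => Math.sqrt(v.reduce((s, x) => s + x * x, 0));
  return dot / (norm(a) * norm(b));
}

// Return the stored document most similar to the query embedding.
function nearest(queryVec, documents) {
  return documents.reduce((best, d) =>
    cosine(queryVec, d.vec) > cosine(queryVec, best.vec) ? d : best);
}

const queryVec = [0.85, 0.15, 0.05]; // pretend embedding of "Can I get my money back?"
console.log(nearest(queryVec, docs).text); // "Refund policy for damaged goods"
```

A production vector database adds indexing for scale, but the retrieved text is used the same way: injected into the prompt so the model answers from business data rather than from fine-tuning.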

Einstein Copilot Search: Salesforce launched Einstein Copilot as an AI-powered assistant meant to be used across every Salesforce application with customization capability powered by Einstein Copilot Studio. Einstein Copilot Search, available along with Einstein Copilot in February 2024, will enhance AI search capabilities by sifting through diverse data sources. This will provide internal and external stakeholders with an AI assistant to solve problems by looking through real-time customer structured data and other relevant unstructured data. Depending on the usage, this will help realize greater internal efficiencies and higher customer satisfaction with an AI assistant that understands and addresses complex queries by accessing previously unattainable insights and knowledge.  Einstein Copilot Search also provides citations to the source material. Salesforce’s Einstein Trust Layer builds trust and confidence in AI-generated content while maintaining data governance and security.

The Salesforce AWS partnership: Any AI offering requires a robust cloud infrastructure for support. Salesforce has a long collaboration with AWS, beginning in 2016 with an infrastructure partnership to support customer data sovereignty requirements. An expanded partnership was announced at Amazon’s re:Invent 2023 and reiterated at the World Tour in NYC; it connects Salesforce’s Einstein, Data Cloud, Service Cloud, and Heroku to AWS services. Salesforce now leverages Amazon Bedrock through the Einstein Trust Layer. The Salesforce Data Cloud has zero-ETL data integrations with AWS data services like Amazon Athena, Amazon Aurora, and Amazon Redshift.

Summary

Since OpenAI launched ChatGPT, built on GPT-3.5, in November 2022, Gen AI has garnered the mind share of the public and companies alike. Companies across the globe are looking for ways to infuse Gen AI into their business environment to increase employee productivity and improve customer satisfaction while growing revenue streams with new business models.

These Data Cloud and Einstein announcements will give existing Salesforce customers tools to integrate structured and unstructured data and to develop new ways of combining and visualizing data. The expanded AWS partnership gives Salesforce new capabilities to meet customer requirements. But without solid business use cases, these offerings may not move beyond the hype. For them to find traction, Salesforce must quickly develop user scenarios by industry.

Ram Viswanathan, Consultant and ex-IBM Fellow, contributed to this blog post.

The post Unleash AI Insights: Salesforce Einstein 1 Now Reads Your PDFs & Emails appeared first on Robustcloud.

]]>
Past, Present, and Future of WebAssembly (Wasm) and What It Means for the Industry https://robustcloud.com/past-present-and-future-of-wasm/ Sat, 30 Dec 2023 23:28:24 +0000 https://robustcloud.com/?p=3862 This blog post is a follow-up to “Is WebAssembly the Next Big Client Technology Ready to Unseat Time-Tested JavaScript? “ published on November 3rd, 2023. Introduction: WebAssembly (Wasm), a game-changer for web development, is coming to overcome the JavaScript drag. This binary-powered virtual machine lets you code in languages like C++, Rust, and more, then […]

The post Past, Present, and Future of WebAssembly (Wasm) and What It Means for the Industry appeared first on Robustcloud.

]]>

This blog post is a follow-up to “Is WebAssembly the Next Big Client Technology Ready to Unseat Time-Tested JavaScript?” published on November 3rd, 2023.

Introduction: WebAssembly (Wasm), a game-changer for web development, aims to overcome JavaScript’s performance drag. This binary instruction format for a stack-based virtual machine lets you code in languages like C++, Rust, and more, then run them at near-native speed. Secure, portable, and packed with benefits, Wasm is turbocharging web apps from games to machine learning, and the future holds even more possibilities. This blog post covers the benefits of Wasm and the applicability of Wasm to different solutions, followed by future trends and a summary.
Figure 1: The future of Wasm created by DALL-E

The benefits: Wasm is a portable binary format that code written in various programming languages can be compiled to. This enables the same executable code to run on disparate client devices. Wasm offers several advantages over other web technologies, such as JavaScript:

     Performance: Wasm code can execute at near-native speed, significantly faster than JavaScript code. This is because Wasm ships as compact bytecode that the engine compiles directly to machine code, while JavaScript must first be parsed and then interpreted or just-in-time compiled by the browser’s engine, which adds overhead.

     Portability: Wasm modules can be compiled from various programming languages, including C, C++, Rust, and AssemblyScript. This makes it possible to run code written in these languages on the web without having to rewrite it in JavaScript. This is a significant advantage for developers, as it allows them to use the programming language that they are most comfortable with and that is best suited for the task at hand.

     Security: Wasm modules are sandboxed, meaning they cannot access the browser’s DOM or other resources without permission. This makes Wasm modules more secure than JavaScript code, which can access the browser’s entire environment. This is important for protecting users from malicious code.

In addition to these advantages, Wasm also offers several other benefits, such as:

     Size efficiency: Wasm modules are typically much smaller than JavaScript code. This makes them faster to download and execute, which is essential for web applications.

     Ecosystem: Wasm is supported by a growing ecosystem of open-source tools and libraries. This makes it easier for developers to start with Wasm and build high-quality Wasm applications.
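
The portability claim is easy to demonstrate outside the browser: the same module bytes run unchanged in Node.js or via WebAssembly.instantiate() in a browser. The hand-assembled module below exports a single add function; in practice, such modules are produced by compilers from C, C++, Rust, and other languages rather than written by hand:

```javascript
// A minimal hand-assembled Wasm module exporting add(a, b) -> a + b.
const bytes = new Uint8Array([
  0x00, 0x61, 0x73, 0x6d, 0x01, 0x00, 0x00, 0x00,       // magic "\0asm" + version 1
  0x01, 0x07, 0x01, 0x60, 0x02, 0x7f, 0x7f, 0x01, 0x7f, // type section: (i32, i32) -> i32
  0x03, 0x02, 0x01, 0x00,                               // function section: func 0 uses type 0
  0x07, 0x07, 0x01, 0x03, 0x61, 0x64, 0x64, 0x00, 0x00, // export section: "add" = func 0
  0x0a, 0x09, 0x01, 0x07, 0x00,                         // code section: one body, no locals
  0x20, 0x00, 0x20, 0x01, 0x6a, 0x0b,                   // local.get 0, local.get 1, i32.add, end
]);

// Compile and instantiate synchronously; the same bytes work in browsers.
const instance = new WebAssembly.Instance(new WebAssembly.Module(bytes));
console.log(instance.exports.add(2, 3)); // 5
```

The host only sees the exported functions; everything else in the module runs inside the sandbox, which is the security property described above.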

Applicability: Wasm is still a relatively new technology but is quickly gaining popularity among web developers. It is already being used to power a wide range of web applications, including:

     Games: Wasm is being used to develop high-performance web games, such as Dota Underlords, Godot, and Roblox.

     Media and Entertainment: Wasm is used to develop powerful image and video editing tools, such as Paintbrush, FFmpeg, and Adobe Photoshop Express Web.

     Machine Learning: Wasm is used to develop machine learning applications that can run in the browser, such as TensorFlow.js and PyTorch.js.

     Serverless Computing: Wasm is used to develop serverless applications on platforms like Cloudflare Workers and AWS Lambda@Edge.

     Distributed Systems: Wasm is being used to develop distributed systems like WasmCloud and Agones.

In addition to these traditional web applications, Wasm is also being used to develop new and innovative types of applications, such as:

     Augmented Reality (AR) and Virtual Reality (VR) applications: Wasm can be used to develop AR and VR applications that run in the browser without the need to install a native application. This makes AR and VR more accessible to a broader range of users.

     Blockchain applications: Wasm can be used to develop blockchain applications that run in the browser. This makes developing and deploying decentralized applications (DApps) easier.


Future trends: Here are some additional trends in Wasm development to watch for in the future:

     More languages: More programming languages are being supported by Wasm compilers, making it even easier to develop Wasm modules. For example, there are now Wasm compilers for languages like Go, Swift, and Kotlin.

     Better tooling: Improved tooling makes developing, debugging, and deploying Wasm modules easier. For example, IDE plugins and debuggers now support Wasm development.

     New use cases: Wasm is being used in new and innovative ways, such as to develop new types of web applications, AR and VR applications, and blockchain applications.


Summary: Wasm’s impact is undeniable. Wasm is not just a JavaScript alternative; it’s a gateway to a future where speed, security, and portability reign. From blazing-fast games to AI-powered experiences, Wasm unlocks untapped potential on the web. Its ability to embrace diverse languages and use cases hints at a boundless future where developers can craft without boundaries and users enjoy unprecedented performance. As the ecosystem matures and open-source tools flourish, keep your eyes peeled for Wasm’s next grand act, which has only begun.

Ram Viswanathan, Consultant and ex-IBM Fellow, co-authored this piece with Larry Carvalho, Principal Consultant, RobustCloud.

The post Past, Present, and Future of WebAssembly (Wasm) and What It Means for the Industry appeared first on Robustcloud.

]]>