
Energy Transformation

Siemens Digital Industries Software

Available Episodes (5 of 14)
  • Harnessing the Power of Knowledge Graphs for AI - Part 2
    In Part 2 of this series on knowledge graphs, we discuss how these powerful tools are transforming industries grappling with vast amounts of complex data, such as energy, chemicals, and infrastructure. The discussion kicks off with a common challenge for field engineers: the sheer difficulty of gathering comprehensive information about a single asset, such as a flange connection. Historically this has been a laborious task, often involving sifting through mountains of paper manuals, retyping information, and manually linking disparate systems. This complexity has often slowed digitalization in these industries compared to manufacturing. Bill Hahn points out the immense scale of the data, particularly in areas like upstream oil and gas, where geological databases can contain billions of data nodes. This is where RapidMiner steps in, with knowledge graphs offering a fresh approach. Handling massive data: RapidMiner is designed to manage tens to hundreds of billions of data nodes, performing complex queries in seconds or minutes rather than days, a speed achieved through in-memory processing and optimized relationships. Virtual integration: instead of painstakingly migrating all data into a single, massive data lake, knowledge graphs create a virtual web of relationships across existing data sources; the data stays where it is, but its connections are made intelligently in memory. Empowering AI: Ben Szekely explains that knowledge graphs provide the dense, information-rich context that AI agents and large language models (LLMs) need, making AI responses more accurate and reliable and reducing the "hallucinations" that occur when LLMs guess at relationships in uncontextualized data. It's like giving AI a super-smart librarian for all your company's information! A key clarification is that knowledge graphs don't replace existing system integrations; they complement them.
    Existing integration tools can be used to "surface" data into the knowledge graph's ontology, creating a higher-level data integration plane. Companies can therefore leverage their prior investments while gaining new, powerful capabilities. An instructive analogy is browsing a webpage: when you visit a website, your browser brings a view of that data into memory; you're not downloading the entire web server. Similarly, a knowledge graph pulls a view of data from systems like ERP or PLM into memory, allowing you to ask questions and run reports without migrating the entire database, and you can refresh that view to get the latest information. Perhaps the most exciting takeaway is the promise of rapid time to value: practically speaking, organizations can start seeing value from RapidMiner in weeks, a dramatic shift from the months or years often associated with large-scale data projects. Because the ontology is flexible, you don't need a new development cycle for every new question, which accelerates insights and decision-making.
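    The "virtual integration" idea described above can be sketched in a few lines of Python. This is a hypothetical illustration of the pattern, not RapidMiner code: the source systems, field names, and helper functions are all invented for the example.

```python
# Minimal sketch of virtual integration: two "systems of record" stay
# where they are, while a lightweight in-memory graph holds only the
# relationships between their records. All names here are hypothetical.

# Stand-ins for existing source systems (e.g., an ERP and a CMMS).
erp_assets = {"FL-100": {"type": "flange", "site": "Plant A"}}
maint_records = [
    {"asset_id": "FL-100", "note": "Gasket replaced"},
    {"asset_id": "FL-100", "note": "Torque check passed"},
]

# The knowledge graph: node id -> set of (relation, target id) edges.
graph = {}

def link(src, relation, dst):
    graph.setdefault(src, set()).add((relation, dst))

# "Surface" relationships from the maintenance system into one ontology,
# without copying the underlying records.
for i, rec in enumerate(maint_records):
    link(f"maint:{i}", "performed_on", f"asset:{rec['asset_id']}")

def neighbors(node_id, relation):
    """Return source nodes whose edges of the given relation point here."""
    return [src for src, edges in graph.items()
            for rel, dst in edges if dst == node_id and rel == relation]

# One query now spans both systems; attribute lookups resolve back to
# the original sources rather than a migrated copy.
history = [maint_records[int(n.split(":")[1])]["note"]
           for n in neighbors("asset:FL-100", "performed_on")]
print(sorted(history))  # → ['Gasket replaced', 'Torque check passed']
```

    Refreshing the view, as described in the analogy above, would simply mean re-running the surfacing loop against the live sources.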
    --------  
    17:24
  • Harnessing the Power of Enterprise Knowledge Graphs for AI and Analytics
    This episode features Siemens veterans John Nixon and Bill Hahn, along with Ben Szekely, who joins from the Altair family (now part of Siemens). They discuss how Altair's RapidMiner platform fits into Siemens' portfolio and the broader AI and data ecosystem. Key points: RapidMiner provides a low-code platform for machine learning and data analytics, helping a wide range of users deploy AI models quickly. It also includes enterprise knowledge graph capabilities, which allow companies to integrate and contextualize diverse data sources without moving all the data into a centralized platform. The knowledge graph creates a virtual "blanket" over a company's existing data systems, automatically mapping relationships between data points from different sources (e.g., linking maintenance records to asset performance data). This contextualized data provides a stronger foundation for AI models, which tend to perform better with rich, interconnected data than with a bare data lake. RapidMiner enables faster time to value than traditional approaches of manually integrating data or building a monolithic data warehouse. A key use case is maintenance and operations, where RapidMiner can quickly pull together all relevant data about an asset (from manuals to sensor data) to empower field technicians and engineers. Overall, the discussion highlights how RapidMiner's capabilities around data integration, knowledge graphs, and low-code AI can help Siemens customers accelerate their digital transformation and get more value from their existing data and systems.
    --------  
    24:16
  • The Resurgence of Nuclear Energy - Part 2
    In part 2 of our Nuclear Energy series, our panel delves into the critical role of supply chain management, data orchestration, and digital twins in shaping the future of nuclear power. The conversation highlights how these innovations are not only enhancing the safety and reliability of nuclear operations but also unlocking new possibilities for optimization and predictive maintenance. Key highlights of the discussion include:
      • Digitalization and data orchestration are crucial to enabling a "nuclear renaissance"
      • Digital twins and advanced analytics (AI, ML, physics-informed models) enable simulation of "what-if" scenarios to predict and prevent failures, along with informed, real-time predictions of asset degradation and maintenance needs
      • New nuclear reactor designs are prioritizing maintainability and serviceability to address historical challenges
      • Reduced-order models and edge computing are empowering near real-time simulation and decision-making for operators
      • Collaboration between nuclear operators, regulators, and technology providers is crucial to accelerating the adoption of these transformative digital capabilities
    --------  
    17:29
  • From the Classroom to the Field: Empowering the Next Generation of Engineers
    In this episode, Dora Smith and John Nixon of Siemens discuss the critical trends and challenges in academia and workforce development, particularly in the energy, chemicals, and infrastructure sectors. Our guests highlight several key challenges facing these industries: The rapid growth of digital technologies like AI, machine learning, and the industrial metaverse is creating a major skills gap, and workers need to develop "digital fluency" to leverage these new tools effectively. There is a significant labor shortage, with high demand for skilled workers in fields like welding, pipefitting, and data center operations to support growing energy needs. Traditionally, academia has struggled to keep curricula current with industry needs. However, this is starting to change, with 60% of universities now looking to supplement degrees with industry-created micro-credentials. Dora and John also explain how micro-credentials can play a vital role in bridging this skills gap. These bite-sized, industry-recognized certifications allow for more agile, targeted upskilling than traditional 4-year degrees, and they can be integrated into degree programs, creating clear pathways for students. We also learn how Siemens is developing internal micro-credential programs to upskill its own workforce, helping ensure workers have the necessary digital skills and can leverage emerging technologies in the field.
    --------  
    20:01
  • The Resurgence of Nuclear Energy in the Digital Age
    In this episode, our panel of experts sheds light on exciting developments and challenges within nuclear energy. Nuclear energy is increasingly recognized as vital for ensuring energy security and stability, and the growing energy demand from artificial intelligence (AI) and data centers is driving the need to accelerate nuclear plant design and deployment. Our guests highlight the complexities behind large-scale nuclear projects and the imperative to bring advanced reactor technologies to the grid. Featured guest: Emilio Baglietto, Associate Department Head and Professor of Nuclear Science and Engineering at MIT. AI and the growth of data centers are reshaping the energy sector: by 2030-2035, data centers are projected to consume 9% of US energy. This surge in demand is prompting a reevaluation of how nuclear facilities are designed and constructed. Traditionally, facility construction spans 7-10 years, a timeline that must be accelerated to meet the rapid pace of innovation. What you'll learn in this episode:
      • AI and data centers: the new drivers of nuclear demand
      • Challenges with past nuclear projects
      • Is a nuclear renaissance ahead?
      • Digitalization: the engine to redesign nuclear projects, manage lifecycles, and foster better collaboration with the supply chain
      • Use of technology to meet safety requirements, ensure timely project completion, and reduce rework
      • The need for nuclear facilities to accelerate their design processes to meet capacity demands
    Connect with Emilio Baglietto, John Nixon, Greg Elliot, and Matt McCullum on LinkedIn.
    --------  
    13:21

About Energy Transformation

by Siemens Digital Industries Software


v7.23.12 | © 2007-2025 radio.de GmbH
Generated: 11/20/2025 - 2:36:53 AM