ELE Times

Address: https://www.eletimes.ai/

R&S drives connections and innovations at MWC Barcelona 2026

Fri, 02/06/2026 - 08:57

Rohde & Schwarz will exhibit its extensive portfolio for the next generation of wireless technologies under the motto “Enabling Connections, Empowering Innovations” at Mobile World Congress 2026 in Barcelona, Fira Gran Via, hall 5, booth 5A80, from March 2 to 5, 2026.

The path from 5G to 6G

For a seamless evolution from 5G to 6G, Rohde & Schwarz offers future-ready test solutions for mobile devices and networks. Among the many innovative solutions, the CMX500 one-box signalling tester stands out across multiple demos, addressing today’s and tomorrow’s testing challenges.

  • Paving the way for 6G, Rohde & Schwarz showcases carrier aggregation combining FR1 and FR3 frequency ranges with its CMX500 one-box signalling tester. The demonstration validates end-to-end device behaviour across the aggregated spectrum. FR3 (7.125 to 24.25 GHz) has been identified by industry and research as a “sweet spot” for combining wide-area coverage with high capacity. Equipped with the new, upgradeable RFU18 board for the CMX500, the tester covers up to 18 GHz, giving users enough headroom for FR3 evolution and a future-ready path for testing next-generation networks.
  • Another setup addresses virtual signalling testing. Based on the CMX500, Rohde & Schwarz demonstrates a new approach of shift-left testing, allowing R&D engineers to find design flaws early in their mobile radio modem chips before costly silicon fabrication. This early SDR-based validation will significantly cut time-to-market for 6G devices.
  • Ray tracing simulates real-world signal propagation environments, making it a valuable technique for AI receiver testing for future 6G devices. Rohde & Schwarz will showcase the CMX500 as it creates a digital twin of signal propagation within its test environment by leveraging the VIAVI™ ray tracing engine. This enables controlled and reproducible validation of complex scenarios with high measurement precision, facilitating site-specific optimisation of radio links and reducing the need for tedious field tests.
  • Rohde & Schwarz also advances 5G and emerging 6G testing with its AI-based toolset AI Workplace for the CMX500, massively enhancing testing productivity. TechAssist uses natural language to control the CMX500, enabling rapid test-scenario setup and status/configuration queries, while an upgraded ScriptAssist with a new interface simplifies and accelerates scripting for R&D protocol and application testing as well as instrument automation. Visitors can experience these AI-powered tools in action within various setups at MWC 2026.
  • Mobile XR and personal AI devices like smart glasses and wearables are key for 5G-Advanced and 6G-enabled immersive 3D communications. Delivering compelling, low-latency experiences will require rigorous, realistic testing. Rohde & Schwarz will demonstrate an end-to-end testbed centred around the CMX500, addressing AI on RAN and XR testing challenges with its ability to emulate 4G, 5G and Wi-Fi networks, applying both RF and IP impairments to reproduce real-world conditions such as interference and congestion.
  • 6G ISAC (Integrated Sensing and Communication), which leverages mobile networks for object detection, is rapidly gaining traction. Rohde & Schwarz will demonstrate new capabilities of its R&S AREG800A, including the emulation of micro-Doppler signatures – in addition to distance, speed and RCS – to support object classification, such as drones.
  • For testing base stations and network infrastructure, Rohde & Schwarz showcases the PVT360. It meets the requirements for testing FR1/FR2, small cells and O-RU in a single box. For the verification of frequency converting antennas used in SATCOM, NTN or 5G and 6G applications, visitors can learn about CATR-based over-the-air test chambers, enabling fast OTA-testing of phased antenna arrays.
  • With the first off-the-shelf commercial mobile devices now available for 5G broadcast, Rohde & Schwarz lets visitors explore seamless rich data distribution transmission to mobile devices, innovative applications like venue casting, emergency alerts and advanced solutions for terrestrial positioning, navigation and timing.

From ground to orbit with NTN

As terrestrial and satellite-based networks converge, it becomes increasingly complex to simulate real-world conditions while meeting 3GPP requirements, for instance, when it comes to handovers within orbits, between orbits or from space to ground. As NTN technology matures alongside 5G and towards 6G, overcoming significant technical hurdles is key to realising NTN’s potential.

  • Rohde & Schwarz has upgraded its CMX500 one-box signalling tester, supporting NR-NTN, NB-NTN and Direct-to-Cell (D2C/DTC) technologies in a single platform. The tester creates a digital twin of the sky, simulating orbits, bands and impairments like Doppler shifts and fading. Combined with smart features like the Constellation Insights Tool, it allows engineers to visualise satellite constellations, analyse coverage gaps and observe trajectories.
  • Rohde & Schwarz also supports NTN conformance and carrier acceptance testing, offering the highest number of validated test cases for NR-NTN according to 3GPP Rel.17. In cooperation with Samsung, validations were conducted across all three test domains: RF, RRM and PCT. At MWC 2026, visitors will not only be able to experience these test cases but also see a demonstration of Viasat’s test plan for NB-NTN, covering protocol, performance and RF test scenarios.

Industry collaborations to accelerate AI-RAN

AI is becoming an integral part of the RAN, enabling performance optimisation, improved energy efficiency and more autonomous operations. As a member of the AI-RAN Alliance, Rohde & Schwarz continues industry collaboration and provides reliable test equipment for navigating interoperability in this evolving landscape.

  • Rohde & Schwarz and Nokia Bell Labs have collaborated on an AI/ML-based 6G base station radio receiver employing Digital Post Distortion (DPoD) to recover distorted uplink signals. DPoD improves link budget, preserves coverage and reduces the need for dense site deployments, lowering costs. DPoD also reduces mobile device complexity and power consumption. The testbed at the Rohde & Schwarz booth, comprising the R&S SMW200A vector signal generator and the newly launched FSWX signal and spectrum analyzer will showcase the improved performance of Nokia’s AI receiver for uplink signals with different distortion levels.
  • In collaboration with NVIDIA, Rohde & Schwarz will exhibit its latest proof-of-concept, also leveraging digital twin technology and high-fidelity ray tracing. This approach creates a robust framework for testing AI-enhanced base stations for both 5G-Advanced and 6G under realistic propagation conditions. This integration aims to bridge the gap between AI-driven wireless simulations and real-world deployment, facilitating more efficient and accurate testing of next-generation receiver architectures.

Next-generation Wi-Fi experience

Wi-Fi 8 sets new expectations for consistent, ultra-high-reliability and quality connectivity. Designed to handle a growing number of connected devices and demanding applications like XR or industrial IoT, IEEE 802.11bn employs ever more complex MIMO (Multiple-Input, Multiple-Output) scenarios. Rohde & Schwarz enables manufacturers with its solution portfolio, from R&D to production.

  • The CMX500 one-box signalling tester is now equipped with comprehensive Wi-Fi 8 capabilities. The tester’s flexibility and embedded IP test capabilities make it a versatile solution for a broad range of Wi-Fi 8-specific tests, such as dRu (distributed resource unit), introducing distributed resource allocation, and UEQM (unequal modulation) where different MIMO layers use different modulation schemes, as well as 320 MHz channel bandwidth.
  • To navigate the technical complexities of Wi-Fi 8 throughout the entire device lifecycle – from development to production – Rohde & Schwarz will exhibit the CMP180 radio communication tester, designed for testing in non-signalling mode with advanced capabilities and broad bandwidth support. The CMP180 combines two analysers and generators for efficient testing of 2×2 MIMO Wi-Fi 8 devices.
  • For high-end MIMO signal generation and analysis tasks in R&D, Rohde & Schwarz will display the R&S SMW200A vector signal generator and the newly launched FSWX signal and spectrum analyser. With its outstanding standard EVM performance and in combination with its cross-correlation feature, the FSWX discovers details of Wi-Fi 8 signals that have been hidden up to now and offers new margins for optimisation. Its multichannel architecture makes the FSWX well-suited for analysing complex scenarios like multi-user MIMO (MU-MIMO).

Automotive connectivity testing 

Vehicle manufacturers are integrating increasing levels of wireless connectivity to enable new user experiences, safety features and higher levels of autonomous driving. Rohde & Schwarz offers precise test solutions that cover all wireless technologies used in the automotive industry, from 5G and ultra-wideband to C-V2X and GNSS.

  • With NG eCall becoming mandatory for vehicles sold in Europe starting in 2026, Rohde & Schwarz will demonstrate compliance testing capabilities using the CMX500 one-box signalling tester and R&S SMBV100B vector signal generator. The test solution also provides automated testing for the upcoming Chinese automotive GNSS test standard GB/T 45086.1-2024, expected to become mandatory for the Automotive Emergency Call System in 2027.
  • Non-terrestrial networks have the potential to provide ubiquitous automotive connectivity and require enhancements to key components such as the chipsets, TCU and antennas. Trade show visitors can discover at MWC 2026 how the company’s comprehensive NTN test solutions can help the automotive industry create the always-connected vehicle.

Solutions for mission-critical communications and spectrum monitoring

Mission-critical communications (MCX) support public safety, first responders and emergency services by providing extremely reliable, low-latency and secure communications even in adverse conditions. Rohde & Schwarz will showcase its integrated solutions for testing devices and mobile networks, facilitating the ongoing migration to 3GPP-compliant broadband mission-critical services.

  • The QualiPoc platform will be demonstrated with new capabilities for MCX testing. This smartphone-based solution allows detailed performance assessment of MCX private and group calls, including measurement of 3GPP-defined MCX KPIs. New features include direct MCX app control and the ability to measure quality of service (QoS) and quality of experience (QoE) for public safety communications. The R&S LCM, an autonomous monitoring probe, and the R&S TSMS8, the fastest network scanner, will also be on display, further expanding capabilities for both business and mission-critical networks.
  • Rohde & Schwarz will also exhibit a protocol conformance test solution to verify that MCX devices and client software implementations adhere to 3GPP specifications.
  • Expanding its spectrum monitoring portfolio, Rohde & Schwarz will launch two new products at MWC Barcelona 2026. These solutions will enable regulatory authorities, network operators and public services in over 100 countries to actively protect the electromagnetic spectrum and address evolving monitoring challenges. The new devices will enhance capabilities in interference hunting and regulatory compliance.

Endpoint security, network visibility and secure network solutions

Robust security solutions deliver seamless and reliable communications experiences. Rohde & Schwarz subsidiaries will also present their innovative solutions supporting the wireless ecosystem.

  • The Rohde & Schwarz Networks & Cybersecurity division, comprising the subsidiaries Rohde & Schwarz Cybersecurity and LANCOM Systems, provides endpoint security, secure networks and high-quality cryptography. With products “Engineered in Germany”, they ensure trustworthy, reliable and secure data transfer, specialising in the public, critical infrastructures, defence, health, retail and SME verticals. At MWC 2026, Rohde & Schwarz Cybersecurity will showcase the Layer 2 encryptor R&S SITLine ETH NG and the R&S ComSec solution enabling secure mobile working with sensitive data on iPhones and iPads. LANCOM Systems will present an overview of its Wi-Fi 7 access point portfolio, the latest 5G router models and firewalls.
  • As networks become more distributed, encrypted and dynamic, network visibility becomes indispensable. At the Rohde & Schwarz booth, visitors will experience how the ready-to-deploy, DPI-powered R&S Probe Observer delivers deep network visibility, precise real-time traffic analytics and actionable intelligence. Developed by ipoque, a Rohde & Schwarz company, this deep packet inspection (DPI) software probe analyses network traffic at the application level, enabling operators to understand, optimise, and control their networks while supporting faster detection, diagnosis and resolution of network and service issues.

Rohde & Schwarz will showcase its comprehensive portfolio of test and measurement and industry solutions at Mobile World Congress 2026 at Fira Gran Via in Barcelona, in hall 5, booth 5A80. Trade magazine editors and press representatives visiting the event are invited to schedule briefings with their press contact at Rohde & Schwarz.

The post R&S drives connections and innovations at MWC Barcelona 2026 appeared first on ELE Times.

EVs, Software and the Grid: Why the Real EV Challenge Is Infrastructure, Not Vehicles

Fri, 02/06/2026 - 07:47

Speaking at the Auto EV Tech Vision Summit 2025, Mohammadsaeed Mombasawala laid bare a reality the EV industry often skirts around—electric vehicles are evolving fast, but the ecosystem supporting them is dangerously lagging.

Opening his address with a provocative question—“Is EV done and dusted?”—Mombasawala was quick to answer it himself: far from it. Innovation in EVs is accelerating, but the real battleground is no longer the vehicle alone. It is charging infrastructure, grid readiness, and software-defined architectures that will decide the success or failure of the transition.

Charging Anxiety Will Not Be Solved with AC

According to Mombasawala, EV charging anxiety cannot be addressed with slow AC charging solutions. The industry is inevitably moving towards high-power DC fast charging, with capacities of 50 kW and above becoming the new norm.

But charging speed alone is not enough. He highlighted the emergence of plug-and-play charging, where vehicles authenticate themselves automatically through preloaded scripts and cloud connectivity—eliminating the need for RFID cards or manual authentication. In this model, the vehicle communicates with the charger via the cloud, pre-authorises itself, and begins charging seamlessly, reflecting the deeper convergence between EVs and software-defined vehicles (SDVs).
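The authentication flow described above can be sketched as a toy model. All class and method names below are hypothetical illustrations, not any real charging API; production deployments build on standards such as ISO 15118 Plug & Charge:

```python
# Illustrative sketch of a plug-and-play charging handshake: the vehicle
# pre-authorises itself through the cloud, with no RFID card or manual step.
# Names are hypothetical, not a real charging protocol implementation.

class CloudBackend:
    """Holds pre-registered vehicle credentials and billing accounts."""
    def __init__(self):
        self.registered = {}  # vehicle_id -> billing account

    def register(self, vehicle_id, account):
        self.registered[vehicle_id] = account

    def authorise(self, vehicle_id):
        # Authorisation uses preloaded credentials pushed from the cloud.
        return vehicle_id in self.registered


class Charger:
    def __init__(self, cloud):
        self.cloud = cloud

    def plug_in(self, vehicle_id):
        # On plug-in, the charger validates the vehicle via the cloud
        # and starts the session immediately on success.
        if self.cloud.authorise(vehicle_id):
            return "charging"
        return "authorisation failed"


cloud = CloudBackend()
cloud.register("EV-1234", account="acct-42")
charger = Charger(cloud)

print(charger.plug_in("EV-1234"))   # charging
print(charger.plug_in("EV-9999"))   # authorisation failed
```

The point of the sketch is the inversion of the usual flow: identity and billing live in the cloud backend, so the charger itself stays a thin, stateless endpoint.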

Vehicle-to-Grid: Opportunity Born from Crisis

One of the most critical trends Mombasawala pointed to was Vehicle-to-Grid (V2G): using EVs not just as consumers of electricity, but as mobile energy sources capable of feeding power back into the grid.

This, he explained, is not just a technological curiosity, but a necessity born from a looming crisis. “I have done the calculation myself,” he noted. “If all vehicles in Delhi were replaced with EVs and charged using 50 kW fast chargers, the grid would require 7,000 MW of additional power just to charge vehicles within 5–8 minutes. No grid today is prepared for that kind of load.”
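As a quick sanity check on the quoted figure, the 7,000 MW can be decomposed into simultaneous 50 kW charging sessions (a back-of-envelope sketch; the interpretation is ours, not from the talk):

```python
# Back-of-envelope check: how many vehicles charging at once does
# 7,000 MW of additional demand imply at 50 kW per charger?
charger_kw = 50            # one DC fast charger
extra_demand_mw = 7_000    # additional grid demand quoted for Delhi

simultaneous_sessions = extra_demand_mw * 1_000 / charger_kw
print(f"{simultaneous_sessions:,.0f} simultaneous 50 kW sessions")  # 140,000
```

In other words, the figure corresponds to 140,000 vehicles fast-charging at the same instant, which is what makes the peak-load problem so severe.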

The implication is stark: while EV adoption is racing ahead, grid infrastructure is nowhere close to ready.     

The Grid Is the Real Bottleneck

Mombasawala warned that without serious innovation and investment in electrical infrastructure, a rapid EV transition could destabilise the power system itself.

“If we transition the whole country by 2030 at this pace, the grid will collapse,” he cautioned. The issue is no longer just EV range anxiety—it is national power security. Without infrastructure upgrades, consumers may find themselves unable to charge vehicles and facing power shortages at home.

Electrical engineers, he stressed, have a monumental role ahead—not just in vehicles, but in re-architecting the grid to handle electrified mobility at scale.

Software-Defined Vehicles: Complexity Beneath the Surface

While SDVs are often discussed as sleek, updatable platforms, Mombasawala highlighted the hidden complexity beneath the headlines. Today’s vehicles contain hundreds of ECUs communicating through multiple discrete protocols. The industry urgently needs standardisation, moving towards Ethernet-based architectures to manage growing data and control demands.

He also pointed to emerging semiconductor trends such as chiplets, where optical and electronic dies are combined in a single package—underscoring how vehicle electronics are becoming more sophisticated and tightly integrated.

Why the Cloud Is Non-Negotiable

A recurring theme in his address was the absolute necessity of cloud backends for SDVs. With millions of vehicles requiring continuous updates, feature upgrades, and service enhancements, localised solutions are no longer viable. “There is no red reset button,” he reminded the audience. Without cloud-based services, upgrading and managing vehicle software at scale becomes impossible.

AI, Data Centres and the Limits of In-Vehicle Intelligence

One of the most sobering insights came from Mombasawala’s discussion on AI in SDVs. Advanced vehicle functions—braking behaviour, acceleration profiles, comfort tuning—will increasingly rely on AI models trained on massive datasets. But these models cannot be trained inside vehicles.

To put scale into perspective, he cited how companies like Meta use around 600,000 GPUs, while Elon Musk’s Grok reportedly uses 800,000 GPUs in a single data centre. Even with such resources, training models can take weeks. Training safety-critical vehicle systems like braking could require 6–8 weeks per iteration, and continuous retraining as new data arrives.

This underscores a key reality: SDVs are as much a data-centre problem as they are an automotive one.

Beyond the Hype

Mombasawala concluded by grounding expectations around SDVs. While the stories sound exciting, real-world vehicle control systems only stabilise through negative feedback loops, making their design and validation far more complex than popular narratives suggest. The EV transition, he implied, will not be won by flashy announcements alone. It will require deep engineering, infrastructure investment, and a sober understanding of system-level constraints.

As the industry pushes ahead, his message was clear: the future of EVs depends not just on better vehicles, but on grids, software, clouds, and engineers rising together.


Silicon Shield: Role of Semiconductors in Modern Warfare

Thu, 02/05/2026 - 13:40

Courtesy: Orbit & Skyline

War has always been part of human history, and true global peace still seems far away. While modern wars may look intense on TV screens, they are generally far less deadly and destructive than what the world has seen in the past. For example, in February 1945, during the bombing of Dresden, Germany, Allied planes dropped unguided bombs aimed at disrupting supply lines. However, the attack triggered a firestorm that killed about 25,000 people and destroyed most of the city, even though it wasn’t a major military target. In contrast, recent conflicts around the world have used far more advanced weapons but resulted in fewer casualties. As military technology evolved from gunpowder to tanks to nuclear weapons, wars became shorter, though still devastating. World War II remains the deadliest, with an estimated 70–80 million lives lost.

Today, most battles are fought remotely using advanced weapons that strike deep into enemy territory while keeping one’s own forces out of harm’s way. Modern air battles are fought primarily in the Beyond Visual Range (BVR) domain. The era of vintage fighter planes engaging in close-range dogfights is long gone. Today’s fighter jets are equipped with missiles capable of striking targets nearly 200 km away, far beyond visual sight. To detect and track such distant targets, they rely on advanced AESA (Active Electronically Scanned Array) radars and are supported by airborne AEW&C (Airborne Early Warning and Control) systems that serve as eyes in the sky. These BVR engagements are made possible by sophisticated electronic systems, sensors, and radars, all of which depend on complex semiconductor technologies to enable their advanced capabilities.

The shift in battlefield tactics has reduced civilian harm and damage. This is mainly due to precise, guided weapons powered by semiconductor-based integrated circuits (ICs) that enable smart targeting and real-time tracking. Semiconductors, which revolutionised smartphones and satellites, have also made weapons smarter, faster, and more accurate. These ICs are key to guided missiles, radars, drones, and surveillance systems, processing huge amounts of data quickly for precise targeting, adaptive routes, and secure communication. Such electronics have moved warfare from blunt attacks to focused, strategic strikes.

Rise of Precision Warfare

Warfare has significantly transformed from the close-combat battles of the past to today’s reliance on unmanned systems like missiles and drones. A century ago, weapons such as tanks, rifles, and grenades, primarily mechanical and chemical systems, were designed for maximum destruction, often resulting in heavy casualties with little concern for collateral damage. Modern armies no longer use inaccurate, unguided weapons that can cause accidental damage and political problems, especially with the world watching closely. Such mistakes can harm their goals.

Today, nations prioritise precision and control in military operations through smart, guided weapons. These rely on semiconductor components for navigation, guidance, surveillance, and reconnaissance, enhancing effectiveness and enabling shorter, more focused missions. Precision weapons allow defence forces to deliver targeted strikes with minimal collateral damage, signalling intent without triggering full-scale conflict. This strategy provides several key advantages:

  1. Targeted Messaging: Precision strikes focus on military assets, deterring adversaries without full-scale war.
  2. Escalation Control: Pinpoint accuracy neutralises threats efficiently, avoiding prolonged conflicts.
  3. Reduced Civilian Casualties: Minimal unintended damage preserves legitimacy and prevents escalation.
  4. Strategic Deterrence: Precise attacks discourage aggression by keeping critical assets vulnerable.

Semiconductor Technologies in Advanced Weapon Systems

Semiconductor technologies are essential in modern weapon systems, enhancing precision, range, guidance, navigation, surveillance, and electronic warfare. Here are some examples of how various semiconductor technologies (Silicon, GaN, MEMS, FPGAs, Sensors, etc.) are used in modern weapon systems:

Electronic warfare and radar: Electronic warfare (EW) and radar systems, built on different semiconductor technologies, form the key driver of battlefield dominance. Integrated circuits (ICs) fabricated using technologies such as CMOS, GaN, GaAs, and SiGe enable operation across low-frequency digital domains and high-frequency bands—from L-band to millimetre-wave. These technologies support critical functions like signal detection, response generation, and countermeasures, enabling fast RF transmission, real-time processing, and agile control essential for modern EW and radar systems.

At the front end, GaN-based high electron mobility transistors (HEMTs) in monolithic microwave integrated circuits (MMICs) deliver high-power RF amplification with good thermal efficiency, key for long-range radar and jamming. These are paired with low-noise amplifiers (LNAs), often made using SiGe BiCMOS or pHEMT processes, to ensure high sensitivity and signal integrity. RF switches and phase shifters, built on SOI or RF MEMS, enable dynamic beam steering in AESA radars, allowing electronic scanning without moving parts.

High-speed analogue-to-digital (ADC) and digital-to-analogue converters (DAC), typically CMOS or bipolar, digitise wideband signals for baseband processing. Radiation-hardened FPGAs and DSP cores handle FFTs, beamforming, and adaptive filtering. PLL synthesisers and VCOs, using SiGe or advanced CMOS, provide low-jitter, frequency-agile clocking critical for threat detection and deception. Together, these semiconductor components form the backbone of radar and EW systems, delivering precision, speed, and resilience in modern electromagnetic warfare.

Guidance and Navigation: In modern missiles and drones, guidance and navigation rely heavily on semiconductor-based components that ensure precision and reliability even in challenging environments. At the core of these systems are inertial navigation systems (INS), which use MEMS accelerometers and gyroscopes to measure motion and orientation without external signals, crucial for operation in GPS-denied conditions. These are often integrated into compact IMUs (Inertial Measurement Units) that fuse data in real time to calculate position and velocity. When GPS signals are available, GNSS receiver chips provide absolute positioning by processing satellite signals, with advanced modules incorporating anti-jamming and anti-spoofing features to maintain accuracy under electronic warfare. Complementing these are digital signal processors (DSPs) and microcontrollers (MCUs), which handle sensor data processing, flight control algorithms, and real-time decision-making. These systems typically utilise semiconductor technologies such as CMOS for GNSS and signal processing ICs, MEMS processes for inertial sensors, and advanced packaging to integrate these functions in compact, rugged modules. Together, these semiconductor components enable autonomous, accurate, and adaptive navigation critical to modern missile and UAV operations.

Explore how we at Orbit & Skyline help global FABs and system integrators by supporting GaN process optimisation, enabling better RF and radar performance. Read our related post on GaN in semiconductors here.

Surveillance and Reconnaissance: Surveillance and reconnaissance systems today use many types of semiconductor technologies for sensing, data processing, and secure communication. CMOS and CCD image sensors are used in electro-optical and night vision equipment. For thermal and FLIR imaging, infrared detectors made from materials like InGaAs, HgCdTe, and InSb are combined with special Readout ICs (ROICs). In radar systems, GaN HEMT chips are used in power amplifiers to send signals efficiently, while SiGe or pHEMT-based LNAs help in receiving weak signals clearly.

High-speed ADCs and DACs made using CMOS or BiCMOS are used to convert radar signals, including those from Synthetic Aperture Radar (SAR), into digital form for further analysis. SAR provides clear images even at night or in bad weather. Systems use FPGAs, SoCs, and AI chips, often made with advanced FinFET technology, to process this data quickly. AI edge ICs make fast, low-power decisions on the spot. DRAM and SRAM store sensor data, while secure MCUs and RF transceivers enable encrypted, high-speed communication. Together, these semiconductor parts help satellites, aircraft, and drones deliver accurate, real-time battlefield information.

Tiny Chips, Big Defence

Semiconductors also serve as a shield for nations, not just for those using advanced weapons, but especially for those that develop them. While many countries can buy defence systems, only a few can design and manufacture the complex semiconductor ICs inside them. These ICs are classified as commercial, military, or space-grade, with increasing reliability and performance demands. Military systems often operate in extreme conditions from -40°C to 150°C, requiring robust, fail-proof chips. Access to such military-grade ICs is restricted by various regulations and export controls. Hence, having a domestic, secure semiconductor manufacturing ecosystem is critical. Without it, countries risk supply-chain disruptions during strategic needs. These ICs function as sensory organs: the eyes, ears, and brains of missiles, drones, and UAVs, and are essential for mission success.

A nation with precision-strike capabilities can deter adversaries from engaging in prolonged conflict, something seen very clearly in recent global conflicts. These semiconductor-driven systems form a “Silicon Shield,” reducing the impact of warfare on civilians and infrastructure. Compared to the devastation of the World Wars, recent conflicts have seen far less destruction, underlining the value of precision weapons in enabling restrained and strategic military actions.


Silicon Photonics: The Lightspeed Revolution That Will Transform AI Computing

Thu, 02/05/2026 - 12:28

Courtesy: Lam Research

Lam Research is setting the agenda for the wafer fabrication equipment industry’s approach to a silicon photonics revolution, driving the breakthroughs in Speciality Technologies that will enable sustainable AI scaling through precision optical manufacturing.

The artificial intelligence boom has created an energy crisis that threatens to consume more electricity than entire nations. As data centres race to keep pace with AI’s insatiable appetite for computational power, technology leaders like Lam are shaping a fundamental shift that could redefine how we think about high-performance computing. One solution lies in replacing the electrical interconnects that have powered computing for decades with something far more efficient: light.

AI’s Energy Crisis: Why Power Demand Is Surging in Data Centres

Goldman Sachs projects a 160% increase in data centre power demand by 2030, reaching 945 terawatt-hours annually — equivalent to Japan’s entire electricity consumption.
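For context, the two numbers in the projection pin down today’s implied baseline (a simple arithmetic check on the quoted figures, not a number from the report itself):

```python
# A 160% increase that lands at 945 TWh/year implies a current baseline
# of roughly 945 / 2.6 ≈ 363 TWh/year of data centre demand.
projected_twh = 945
increase = 1.60          # +160%

baseline_twh = projected_twh / (1 + increase)
print(f"implied current demand ≈ {baseline_twh:.0f} TWh/year")  # ≈ 363
```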

The problem runs deeper than software inefficiency. According to Bloomberg, AI training facilities house hundreds of thousands of NVIDIA H100 chips, each drawing 700 watts, nearly eight times the power consumption of a large TV. Combined with cooling systems, some hyperscale facilities now require as much power as 30,000 homes, driving tech companies to seriously consider dedicated nuclear plants.
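The rough arithmetic behind that comparison can be laid out explicitly. The specific chip count and the per-home power draw below are illustrative assumptions of ours, not figures from Bloomberg:

```python
# Back-of-envelope: power draw of an AI training facility's chips alone,
# and its rough equivalent in households. Chip count and per-home draw
# are assumed for illustration.
h100_watts = 700
chips = 100_000                          # "hundreds of thousands": take 1e5
chip_power_mw = chips * h100_watts / 1e6

avg_home_kw = 2.3                        # assumed average household draw
homes = chip_power_mw * 1_000 / avg_home_kw
print(f"{chip_power_mw:.0f} MW of chip power ≈ {homes:,.0f} homes")
```

Even before cooling overhead, 100,000 such chips draw 70 MW, on the order of tens of thousands of homes, which is what pushes operators towards dedicated generation.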

[Chart: rack power consumption trends. Source: Nvidia. Apart from average estimates, power data for racks are based on Nvidia specifications; figures for 2025 and later are estimates. AI server racks refer to GPU racks; general-purpose racks refer to CPU racks.]

The Paradigm Shift

Meeting this challenge requires a fundamental change in how chips are designed and connected. Silicon photonics—using light to transmit data—has the potential to provide dramatic improvements in speed and efficiency over traditional electrical interconnects. Precision optical manufacturing makes this shift possible, enabling scalable processes that can support the next era of energy-efficient, high-performance computing.

Silicon photonics represents a fundamental reimagining of how data moves within computing systems. Instead of pushing electrons through copper wires, this technology uses photons—particles of light—to carry information through silicon waveguides that function like nanoscale fibre optic cables, integrated directly onto chips.

The efficiency gains are dramatic. Optical interconnects consume just 0.05 to 0.2 picojoules per bit of data transmitted, compared to the much higher energy requirements of electrical interconnects over similar distances. As transmission distances increase, even within a single package, the energy advantage of photonics becomes overwhelming.
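To make the per-bit numbers concrete, here is a small illustrative comparison. The optical figure is the midpoint of the range quoted above; the 5 pJ/bit electrical figure is an assumed ballpark for off-package copper links, not a value from the article:

```python
# Energy to move one petabyte across a link, comparing the optical figure
# above with an ASSUMED ~5 pJ/bit for off-package copper SerDes.
optical_pj_per_bit = 0.1     # midpoint of the 0.05-0.2 pJ/bit range
electrical_pj_per_bit = 5.0  # assumed ballpark for copper (not from the text)

bits = 8 * 10**15            # one petabyte of data
optical_j = bits * optical_pj_per_bit * 1e-12        # ~800 J
electrical_j = bits * electrical_pj_per_bit * 1e-12  # ~40 kJ
print(optical_j, electrical_j, electrical_j / optical_j)
```

At data-centre traffic volumes, a 50x per-bit difference of this kind compounds into the megawatt-scale savings the article describes.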

TSMC has published several research papers on silicon photonics since late 2023. The company has announced public partnerships with NVIDIA to integrate optical interconnect architectures into next-generation AI computing products. Lam is leading the industry's transition to silicon photonics. As a technology leader with deep expertise in precision manufacturing, we are defining the roadmap for silicon photonics production, working closely with leading foundries and fabless companies to address the unique challenges presented by optical interconnects.

According to Yole Group, the market for silicon photonics is expected to grow from $95 million in 2023 to more than $863 million in 2029, with a 45% annual growth rate that reflects the technology’s expected rapid commercial adoption.
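The quoted growth rate can be checked directly from the endpoints; this is pure arithmetic on the Yole figures above:

```python
# Implied compound annual growth rate from the Yole endpoints above.
start_musd, end_musd, years = 95.0, 863.0, 6   # 2023 -> 2029, $M
cagr = (end_musd / start_musd) ** (1 / years) - 1
print(f"implied CAGR: {cagr:.1%}")   # roughly 44.5%, consistent with ~45%
```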

The Limits of Copper: Why Traditional Interconnects Can’t Scale With AI

At the heart of this energy crisis lies a fundamental bottleneck that has been building for years. While computing performance has advanced at breakneck speed, the infrastructure connecting these powerful processors has not kept pace. Hardware floating-point operations (FLOPS) have improved 60,000-fold over the past two decades, but DRAM bandwidth has increased only 100-fold, and interconnect bandwidth just 30-fold over the same period.
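Expressed as ratios, the scaling figures above show just how far the gap has widened:

```python
# The two-decade scaling figures quoted above, expressed as widening gaps.
flops_gain = 60_000          # hardware FLOPS improvement
dram_gain = 100              # DRAM bandwidth improvement
interconnect_gain = 30       # interconnect bandwidth improvement

flops_vs_dram = flops_gain / dram_gain                   # 600x gap
flops_vs_interconnect = flops_gain / interconnect_gain   # 2000x gap
print(flops_vs_dram, flops_vs_interconnect)
```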

This creates what engineers call the “memory wall,” a constraint where data cannot move between processors and memory fast enough to fully use the available computing power. In AI applications, where massive datasets must flow seamlessly between graphics processors, high-bandwidth memory, and other components, these interconnect limitations become critical performance bottlenecks.

The solution that worked for previous generations—simply shrinking copper interconnects and packing them more densely—is reaching physical limits. As these copper traces become smaller and more numerous, they consume more power, generate more heat, and introduce signal integrity issues that become increasingly difficult to manage. Each voltage conversion in a data centre’s power delivery system introduces inefficiencies, and copper interconnects compound these losses throughout the system.

Modern AI architectures require what engineers call “high access speeds within the stack.” Chips become thinner, interconnects evolve from through-silicon vias (TSVs) to hybrid bonding, and memory modules must connect directly to graphics processors at unprecedented speeds. But when that high-speed memory connection has to traverse copper tracks on a circuit board to reach another processor, much of the bandwidth advantage disappears.

Silicon Photonics Meets AI: Co-Packaged Optics for Next-Gen Performance

Silicon photonics is not entirely new; it has powered telecommunications networks for years through pluggable transceivers that connect data centre racks. These proven systems use silicon photonic dies combined with separate lasers and micro-lens technologies packaged into modules that can be easily replaced if they fail.

But AI’s demands are pushing photonics into uncharted territory. Instead of simply connecting separate systems, the technology must now integrate directly with processors, memory, and other components in what engineers call “co-packaged optics.” This approach promises to bring optical interconnects closer to the actual computation, maximising bandwidth while minimising energy consumption.

The challenge is reliability. While pluggable transceivers can be easily swapped out if they fail, co-packaged optical systems integrate directly with expensive graphics processors and high-bandwidth memory, so a failed optical component makes repair exponentially more complex and costly. Early implementations from major chip developers are still in pilot phases, carefully assessing long-term reliability before full-scale deployment.

Accelerating Adoption: How Industry Timelines Are Moving Faster Than Expected

Capabilities that industry roadmaps once projected for 2035 are already being demonstrated by leading manufacturers. The combination of urgent market need, massive investment, and three decades of accumulated photonics research has created what amounts to a perfect storm for commercialisation.

The implications extend far beyond data centres. As optical interconnects become more cost-effective and established, they have the potential to revolutionise everything from autonomous vehicles to edge computing devices. The same technology that enables sustainable AI scaling could ultimately transform how electronic systems communicate across virtually every application.

Source: Yole Group

The Future of Computing Is Optical Interconnects for Sustainable AI Growth

The question is how quickly it can be implemented and scaled. With leading manufacturers already investing billions and pilot systems entering data centres, the light-speed future of computing is no longer a distant possibility. Companies like Lam, through our customer-centric approach and advanced manufacturing solutions, enable this transformation by providing the precision tools that make commercial silicon photonics possible.

Silicon photonics represents a fundamental technology shift that could determine which companies lead the next phase of the digital revolution. Just as the introduction of copper interconnects enabled previous generations of performance scaling, optical interconnects have the potential to break through the barriers that threaten to constrain AI development.

For an industry grappling with the sustainability challenges of exponential AI growth, silicon photonics offers a path forward that doesn’t require choosing between performance and environmental responsibility. By replacing electrical inefficiency with optical precision, this technology could enable the continued advancement of AI while dramatically reducing its environmental footprint.

The revolution is just beginning, but one thing is clear: the future of high-performance computing is increasingly bright, and Lam is at the centre of it.

The post Silicon Photonics: The Lightspeed Revolution That Will Transform AI Computing appeared first on ELE Times.

AI-Augmented Test Automation at Enterprise Scale

Thu, 02/05/2026 - 12:00

Courtesy: Keysight Technologies

Enterprise test automation does not break because teams lack tools.

It breaks when browser-level automation is asked to validate systems far beyond the browser.

At enterprise scale, software quality depends on the ability to test entire user journeys across the full technology stack, from web and APIs to desktop, packaged applications, and highly graphical systems, without fragmenting tooling or multiplying maintenance effort.

This distinction explains why Keysight Technologies was positioned as a Leader in the 2025 Gartner Magic Quadrant for AI-Augmented Software Testing Tools, recognised for both Ability to Execute and Completeness of Vision.

Gartner defines AI-augmented software testing tools as solutions that enable increasingly autonomous, context-aware testing across the full software development lifecycle. In practice, that definition only matters if it holds up in complex, regulated enterprises.

One notable deployment is American Electric Power (AEP).

Why Browser-Only Automation Hits a Ceiling at Enterprise Scale

Most enterprises already use Selenium successfully for its intended purpose.

Browser automation works well when:

  • The system under test is web-based
  • Interactions are DOM-driven
  • The scope is limited to UI flows

Problems emerge when enterprises attempt to extend browser-centric automation to validate full end-to-end systems that include:

  • Highly graphical or non-DOM interfaces
  • Desktop or packaged applications
  • Field mobility tools and operational systems
  • Integrated workflows spanning UI, APIs, and backend logic

At that point, teams are forced to stitch together multiple tools, frameworks, and scripts. The result is not resilience; it is complexity, fragmentation, and rising maintenance cost.

The issue is not Selenium.

The issue is using a single-layer tool to validate multi-layer systems.

What Gartner Means by AI-Augmented Software Testing

According to Gartner, the market is moving toward platforms that combine and extend automation capabilities, rather than replacing them.

Modern AI-augmented testing platforms are expected to:

  • Orchestrate testing across UI, API, and visual layers
  • Combine browser automation with image-based and model-based techniques
  • Abstract complexity so teams test behaviour, not implementation details
  • Reduce maintenance through models, self-healing, and intelligent exploration
  • Scale across cloud, on-premises, and air-gapped environments

This is not an argument against existing tools.

It is recognition that enterprise testing requires a unifying layer above them.

Enterprise Reality: Complexity, Scale, and Risk at AEP

AEP operates one of the largest electricity transmission networks in the United States, serving 5.5 million customers across 11 states. Its software landscape includes:

  • Customer-facing web applications
  • Financial and billing systems
  • Highly graphical, map-based field mobility applications

Before modernising its testing approach, AEP faced a common enterprise constraint:

  • Browser automation covered part of the estate
  • Critical operational systems remained difficult to validate
  • Manual testing persisted in high-risk workflows
  • Defects continued to escape into production

The challenge was not adopting another tool.

It was testing the full system end-to-end, consistently, and at scale.

How AEP Scaled Full-Stack, AI-Driven Testing

AEP began where confidence was lowest.

Rather than extending browser automation incrementally, the team selected a highly graphical, map-based field mobility application, a system that sat outside the reach of traditional browser-only approaches.

Using AI-driven, model-based testing, the application was automated end-to-end, validating behaviour across visual interfaces, workflows, and integrated systems.

That success changed internal perception.

As AEP’s Lead Automation Developer and Architect explained, proving that even their most complex system could be tested reliably shifted the conversation from “Can we automate this?” to “How broadly can we apply this approach?”

The key was not replacing existing automation, but extending it into a unified, full-stack testing strategy.

Measured Results: Time, Defects, and Revenue Impact

Once deployed across teams, the outcomes were measurable:

  • 75% reduction in test execution time
  • 65% reduction in development cycle time
  • 82 defects identified and fixed before production
  • 1,400+ automated scenarios executed
  • 925,000 exploratory testing scenarios discovered using AI
  • 55 applications tested across the organisation
  • $1.2 million in annual savings through reduced rework and maintenance

In one instance, AI-driven exploratory testing uncovered 17 critical financial defects that had escaped prior validation approaches. Resolving those issues resulted in a $170,000 revenue increase within 30 days.

This is not broader coverage for its own sake.

It is risk reduction and business impact.

Empowering Teams Beyond Test Engineers

Another enterprise constraint is who can contribute to quality.

At AEP, non-technical users were able to create tests by interacting with models and workflows rather than code. This reduced dependency on specialist automation engineers and allowed quality ownership to scale with the organisation.

Gartner highlights this abstraction as critical: enterprises need testing platforms that extend participation without increasing fragility.

What Enterprise Leaders Should Look for in AI Testing Platforms

The strategic question is not whether a tool supports Selenium.

The question is whether the platform can:

  • Combine browser automation with visual, API, and model-based testing
  • Validate entire user journeys, not isolated layers
  • Reduce maintenance while expanding coverage
  • Operate across the full enterprise application stack
  • Scale trust before scaling usage

AEP’s experience illustrates Gartner’s broader market view: AI-augmented testing succeeds when it unifies existing capabilities and extends them, rather than forcing enterprises to choose between tools.

The Strategic Takeaway

Enterprise software quality now depends on full-stack validation, not single-layer automation.

Selenium remains valuable. But enterprise testing requires a platform that goes beyond the browser, orchestrates multiple techniques, and scales across real-world complexity.

Independent analyst research defines the direction. Real enterprise outcomes prove what works. AEP’s results show what becomes possible when AI-augmented testing is treated as a strategic, unifying capability. Not a collection of disconnected tools.

The post AI-Augmented Test Automation at Enterprise Scale appeared first on ELE Times.

Murata Launches New Tech Guide to Enhance Power Stability in AI-driven Data Centres

Thu, 02/05/2026 - 11:40

Murata Manufacturing Co., Ltd. has launched a new technology guide entitled: ‘Optimising Power Delivery Networks for AI Servers in Next-Generation Data Centres.’ Available on the company’s website, the guide introduces specific power delivery network optimisation solutions for AI servers that enhance power stability and reduce power losses across the data centre infrastructure.

The guide addresses the rapid advancement and adoption of AI, a trend driving the continuous rollout of new data centres worldwide. As the industry moves toward higher voltage operations and increased equipment density, the resulting increase in overall power consumption has made stable power delivery a critical business issue for data centre operators. Consequently, the guide focuses on power circuit design for data centres, providing a detailed overview of market trends, evolving technologies in power delivery, and the key challenges the sector currently faces.

To assist engineers and designers, the guide is structured to provide a market overview that breaks down power consumption and technology trends within power lines. It further addresses market challenges and solutions by examining key considerations in power-line design and exploring how the evolution of power placement architectures can facilitate power stabilisation and loss reduction.

Murata supports these architectural improvements with a broad product lineup that addresses advanced and evolving power delivery methods, including multilayer ceramic capacitors (MLCC), silicon capacitors, polymer aluminium electrolytic capacitors, inductors, chip ferrite beads, and thermistors. Furthermore, the company provides comprehensive design-stage support, using advanced analysis technologies to assist with component placement and selection. Backed by a robust global supply and support network, Murata continues to deliver tangible value by solving power-related challenges in data centres.

You can download the full technology guide here: Optimising Power Delivery Networks for AI Servers in Next-Generation Data Centres 

The post Murata Launches New Tech Guide to Enhance Power Stability in AI-driven Data Centres appeared first on ELE Times.

Vishay Intertechnology launches New Commercial and Automotive Grade Power Inductors

Thu, 02/05/2026 - 09:05

Vishay Intertechnology, Inc. introduced four new power inductors in the 2.0 mm by 1.6 mm by 1.2 mm 0806 and 3.2 mm by 2.5 mm by 1.2 mm 1210 case sizes. The commercial IHLL-0806AZ-1Z and IHLL-1210AB-1Z and Automotive Grade IHLP-0806AB-5A and IHLP-1210ABEZ-5A achieve the same performance as the next-smallest competing inductor in 11 % (1210) and 64 % (0806) smaller footprints, while offering higher operating temperatures, a wider range of inductance values, and lower DCR for increased efficiency.

Offering inductance values from 0.24 µH to 4.70 µH and typical DCR down to 6.6 mΩ, the terminals of the IHLL-0806AZ-1Z and IHLL-1210AB-1Z are plated on the bottom only, enabling a smaller land pattern for more compact board spacing. The terminals of the IHLP-0806AB-5A and IHLP-1210ABEZ-5A are plated on the bottom and sides, allowing for the formation of a solder fillet that adds mounting strength against great mechanical shock, while simplifying solder joint inspection. The AEC-Q200 qualified devices provide reliable performance up to +165 °C, which is 10 °C higher than the closest competing composite inductor, and typical DCR down to 15.0 mΩ.

Delivering improved performance over ferrite-based technologies, all four devices feature a robust powdered iron body that completely encapsulates their windings — eliminating air gaps and magnetically shielding against crosstalk to nearby components — while their soft saturation curve provides stability across the entire operating temperature and rated current ranges. Packaged in a 100 % lead (Pb)-free shielded, composite construction that reduces buzz to ultra-low levels, the inductors offer high resistance to thermal shock, moisture, and mechanical shock, and handle high transient current spikes without saturation.

RoHS-compliant, halogen-free, and Vishay Green, the Vishay Dale devices released today are designed for DC/DC converters, noise suppression, and filtering in a wide range of applications. The IHLP-0806AB-5A and IHLP-1210ABEZ-5A are ideal for automotive infotainment, navigation, and braking systems; ADAS, LiDAR, and sensors; and engine control units. The IHLL-0806AZ-1Z and IHLL-1210AB-1Z are intended for CPUs, SSD modules, and data networking and storage systems; industrial and home automation systems; TVs, soundbars, and audio and gaming systems; battery-powered consumer healthcare devices; medical devices; telecom equipment; and precision instrumentation.

Device Specification Table:

| Series | IHLL-0806AZ-1Z | IHLP-0806AB-5A | IHLL-1210AB-1Z | IHLP-1210ABEZ-5A |
|---|---|---|---|---|
| Inductance @ 100 kHz (µH) | 0.24 to 4.70 | 0.22 to 0.47 | 0.24 to 4.70 | 0.47 to 4.70 |
| DCR typ. @ 25 °C (mΩ) | 16.0 to 240.0 | 15.0 to 21.0 | 6.6 to 115.0 | 18.0 to 150.0 |
| DCR max. @ 25 °C (mΩ) | 20.0 to 288.0 | 18.0 to 25.0 | 10.0 to 135.0 | 22.0 to 180.0 |
| Heat rating current typ. (A)(¹) | 1.3 to 6.3 | 4.6 to 5.8 | 2.3 to 9.2 | 1.8 to 5.1 |
| Saturation current typ. (A)(²) | 1.5 to 6.5 | 4.5 to 5.1 | 2.5 to 9.0 | 2.0 to 6.5 |
| Saturation current typ. (A)(³) | 1.8 to 7.2 | 5.4 to 7.5 | 2.9 to 11.5 | 2.5 to 8.2 |
| Case size | 0806 | 0806 | 1210 | 1210 |
| Temperature range (°C) | -55 to +125 | -55 to +165 | -55 to +125 | -55 to +165 |
| AEC-Q200 | No | Yes | No | Yes |

(¹) DC (A) that will cause an approximate ΔT of 40 °C
(²) DC (A) that will cause L0 to drop approximately 20 %
(³) DC (A) that will cause L0 to drop approximately 30 %
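To see why lower DCR translates into efficiency, the winding conduction loss is simply P = I²·R. A minimal sketch using typical DCR values from the specification table above (the load currents chosen are illustrative):

```python
def conduction_loss_w(current_a: float, dcr_mohm: float) -> float:
    """I^2 * R loss in the inductor winding, with DCR given in milliohms."""
    return current_a ** 2 * dcr_mohm / 1000

# IHLP-0806AB-5A at an assumed 5 A load, using its 15 mOhm typical DCR:
print(conduction_loss_w(5.0, 15.0))   # 0.375 W
# A 240 mOhm part (top of the IHLL-0806AZ-1Z DCR range) at 1 A:
print(conduction_loss_w(1.0, 240.0))  # 0.24 W
```

Because the loss scales with the square of the current, halving DCR directly halves this dissipation at any given load.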

The post Vishay Intertechnology launches New Commercial and Automotive Grade Power Inductors appeared first on ELE Times.

Loom Solar Introduces Revolutionary, Scalable CAML BESS Solution up to 1 MWh to Replace Diesel Generators for C&I Sector

Thu, 02/05/2026 - 08:25

Loom Solar, one of India’s leading solar manufacturing companies, announced the launch of its scalable 125kW/261kWh CAML Battery Energy Storage System (BESS), expandable up to 1MWh, a next-generation solution designed to deliver uninterrupted, seamless power to the Commercial and Industrial (C&I) sector, significantly reducing production losses caused by power outages.

Unlike conventional diesel generator-based systems, which typically involve switch-over downtimes ranging from 30 seconds to 3 minutes, Loom Solar’s scalable 125kW/261kWh BESS ensures instantaneous power availability, eliminating operational disruptions in critical industrial processes. The system is engineered for a cleaner, quieter, and safer microgrid application that addresses low-voltage situations and power cuts while delivering continuous power for over two hours, with deep-discharge capability, making it a reliable alternative for businesses that demand high uptime and operational efficiency.

With a lifecycle of up to 6,000 charge–discharge cycles, the scalable 125kW/261kWh BESS offers long-term durability and superior economic value.
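The headline specifications imply the runtime and lifetime throughput directly. This is a simplified estimate that assumes discharge at full rated power and 100% depth of discharge, and ignores efficiency losses and degradation:

```python
# Simplified throughput estimate from the headline specs above.
# Assumes full-power discharge, 100% depth of discharge, no losses.
power_kw = 125.0
energy_kwh = 261.0
cycles = 6_000

runtime_h = energy_kwh / power_kw            # ~2.09 h at full rated power
lifetime_mwh = energy_kwh * cycles / 1_000   # 1,566 MWh total throughput
print(round(runtime_h, 2), lifetime_mwh)
```

The ~2.09 h result is consistent with the "over two hours" of continuous power claimed above.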

Developed through Loom Solar’s strong focus on in-house research and development, and validated through rigorous product testing facilities, the solution reflects the company’s commitment to innovation and reliability. The system is IoT-enabled and compatible with connected energy ecosystems, allowing real-time monitoring, intelligent energy management, and seamless integration with renewable power sources such as solar.

Commenting on the launch, Amod Anand, Co-Founder and Director, Loom Solar, said, “The scalable 125kW/261kWh BESS is a solution-led product designed specifically for India’s C&I sector, where even a few seconds of downtime can translate into significant losses. Our focus has been to replace reactive power backup with intelligent, seamless energy continuity. This solution not only ensures uninterrupted operations but also helps businesses optimise energy costs and move closer to energy independence through renewable integration.”

With this launch, Loom Solar strengthens its position as a key enabler of India’s energy transition, offering integrated solar and energy storage solutions that support energy security, sustainability, and long-term resilience for businesses.

The post Loom Solar Introduces Revolutionary, Scalable CAML BESS Solution up to 1 MWh to Replace Diesel Generators for C&I Sector appeared first on ELE Times.

Infineon strengthens its leading position in sensors acquiring non-optical analogue/mixed-signal sensor portfolio from ams OSRAM

Wed, 02/04/2026 - 12:07

Infineon Technologies AG is expanding its sensor business with the acquisition of the non-optical analogue/mixed-signal sensor portfolio from ams OSRAM Group. The two companies have entered into an agreement for a purchase price of €570 million on a debt-free and cash-free basis. With the planned investment, Infineon will strengthen its position as a leader in sensors for automotive and industrial markets through a complementary portfolio and expand its product range in medical applications. The acquired business is expected to generate around €230 million in revenue in calendar year 2026 and will support Infineon’s profitable growth. The transaction will be accretive to earnings-per-share immediately upon closing, with future synergies enabling substantial additional value creation. As part of the transaction, around 230 employees with expertise in research and development (R&D) and business management will join Infineon. The agreement includes a multi-year supply agreement with ams OSRAM.

“The acquired business is a perfect strategic fit for Infineon and complements our strong offering in the analogue and sensor space. We will be able to provide our customers with even more comprehensive system solutions,” says Jochen Hanebeck, CEO of Infineon. “I am convinced that this is an outstanding technological, commercial and cultural match, generating growth opportunities in our current target markets as well as in emerging areas like humanoid robotics.”

The overall transaction is structured as a fabless asset deal covering sensor products, R&D capabilities, intellectual property and test & lab equipment. The transaction is subject to customary closing conditions, including regulatory approvals, and is expected to close in the second quarter of calendar year 2026. Infineon will fund the acquisition with additional debt, as part of its general corporate financing plans.

Sensors are the link between the physical and the digital world, as they detect and convert signals such as movement, sound, light waves, temperature and even heartbeat and strain into processible data. They are at the core of a wide array of applications like software-defined vehicles, health trackers, and physical AI applications such as humanoid robots. The market potential of the sensor and radio frequency markets is projected to exceed $20 billion by 2027.

The acquired Mixed Signal Products business will add leading medical imaging and sensor interfaces to the portfolio of Infineon, including X-ray solutions and sensors used for valve control, building control technology and metering. The Positioning & Temperature Sensors assets will strengthen Infineon’s high-precision position, capacitive and temperature sensing for automotive, industrial and medical applications, such as chassis position sensing and hands-on detection in vehicles, angle and position sensing for robotics and glucose monitoring.

The acquisition fully supports Infineon’s strategy to grow its sensor business. Infineon established its Sensor Units & Radio Frequency (SURF) unit within its Power & Sensor Systems (PSS) division in January 2025. This aligns with the strategy to offer customers comprehensive system solutions through a powerful, interlinked portfolio in “analogue & sensors”, “power” and “control & connectivity”.

The post Infineon strengthens its leading position in sensors acquiring non-optical analogue/mixed-signal sensor portfolio from ams OSRAM appeared first on ELE Times.

The Rare Earths Catch-22: Why It Exists and How It Can Be Fixed

Срд, 02/04/2026 - 09:09

Speaking at the Auto EV Tech Vision Summit 2025, Bhaktha Keshavachara, CEO, Chara Technologies, highlights the Rare Earth challenges as faced by the world today and what potential policies can resolve them!

As the world strides toward more sustainable solutions, the technologies we use are becoming more rare-earth dependent, from batteries to motors and the magnets inside them. Compounding this, a simultaneous energy transition is taking shape: especially in transportation, we are gradually moving toward meeting our energy needs with electrons rather than hydrocarbons. This necessitates locating supply chains in stable regions or becoming wholly self-sufficient in the raw materials, the rare earths, the 17 elements set apart in the periodic table, as Bhaktha Keshavachara, CEO, Chara Technologies, puts it!

With rare earths, the global catch-22 stems from two specific problems:

  1. It is expensive to buy 
  2. It is hazardous to extract   

Since these materials are critical for a future dominated by electric technologies like EVs and e-buses, it becomes imperative to secure them: by locating supply chains in stable regions, by becoming self-sufficient in their production, or by finding alternatives. Let’s see what Bhaktha had to say about it!

Start Mining or Find Alternatives

“We have to start mining and extraction,” Bhaktha states as he presents his first solution to the rare-earth catch-22. He recounts the strategies adopted by nations globally, including the US, which has notably reopened its rare-earth mines in California. Further, he underlines the ongoing global efforts to find alternative materials for building strong magnets without rare earths. He points to NIRON in the US, which is experimenting with iron nitride magnets, and to Europe’s efforts towards finding an alternative in potassium-strontium magnets.

The problem with rare-earth mining is the hazardous nature of the process, which has left affected populations suffering from cancer and other long-term health damage. “If you see pictures on the net of the west coast of China, actually in central China, there are like cancer villages,” Bhaktha recounts.

Alternative Motor Technologies or Materials

Further, he suggests using alternative motor technologies to reduce the rare-earth content of the overall product. In the same vein, he refers to various motor types, including electrically excited synchronous motors (EESM), induction motors (IM), and synchronous reluctance motors (SynRM). He also touches upon light rare-earth materials, calling for greater use of them as opposed to the heavy rare-earth materials over which, as he mentioned in his address, China holds a stronghold.

India’s Situation 

Talking about India’s situation, Bhaktha says, “We have rare earths, but not all the 17 rare earths, but still we can do with whatever we have, and potentially we can import ore which has dysprosium and other rare earth materials.” He also recounts past events wherein global price fluctuations anchored by China led two big companies in India to drop magnet manufacturing projects, as the projects suddenly became unviable in business terms.

In the same sequence, he cites the example of the US government, which has stepped in to set minimum prices for magnets irrespective of global market fluctuations, in order to support the industry and enable localisation of the technology and materials.

National efforts, Global Repercussions  

In the midst of all these challenges, Bhaktha reaffirms his determination to face the storm head-on, calling upon the industry to innovate for the better. He says, “I think if we do the innovation and take the leadership role in prioritizing this, we not only have a huge opportunity to do something new in India, but there is a huge opportunity to export to the rest of the world because the rare-earth problem is a global problem.”

The post The Rare Earths Catch-22: Why It Exists and How It Can Be Fixed appeared first on ELE Times.

New Power Module Enhances AI Data Centre Power Density and Efficiency

Wed, 02/04/2026 - 08:13

Increasing AI and high-performance computing workloads demand power solutions that combine efficiency, reliability and scalability. Integrated power modules help streamline design, reduce energy use and deliver the stable performance required for advanced data centres. Microchip Technology announces the launch of the MCPF1525 Power Module, a highly integrated device with a 16V Vin buck converter that can deliver 25A per module, stackable up to 200A. The MCPF1525 enables higher power delivery within the same rack space and offers programmable PMBus and I2C controls. The device is designed to power the latest generation of PCIe switches and high-performance compute MPU applications needed for AI deployments.

The MCPF1525 is packaged in an innovative vertical construction that maximises board space efficiency and can offer up to a 40% board area reduction when compared to other solutions. The compact power module is approximately 6.8 mm x 7.65 mm x 3.82 mm, making it an optimal solution for space-constrained AI servers.

For increased reliability, the MCPF1525 includes multiple diagnostic functions reported over PMBus, including over-temperature, over-current and over-voltage protection to minimise undetected faults. With a thermally enhanced package, the device is engineered to work within an operating junction temperature range of -40°C to +125°C. An on-board embedded EEPROM allows users to program the default power-up configuration.
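The stacking arithmetic follows directly from the 25A-per-module, 200A-maximum figures above. A hypothetical sizing helper (the ceil-division logic is an illustration, not a Microchip tool):

```python
import math

# Illustrative sizing helper based on the figures above:
# 25 A per MCPF1525 module, stackable to 200 A.
MODULE_CURRENT_A = 25
MAX_STACK_A = 200

def modules_needed(load_a: float) -> int:
    """Smallest number of stacked modules that covers the load current."""
    if load_a > MAX_STACK_A:
        raise ValueError("load exceeds the 200 A stacking limit")
    return math.ceil(load_a / MODULE_CURRENT_A)

print(modules_needed(90))   # 4 modules
print(modules_needed(200))  # 8 modules (the full stack)
```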

“By leveraging Microchip’s comprehensive solutions, including PCIe Switchtec technology, FPGAs, MPUs and Flashtec NVMe controllers, the MCPF1525 power module can help customers achieve the system efficiency, reliability and scalability required for high-performance data centre and industrial computing applications,” said Rudy Jaramillo, vice president of Microchip’s analogue power and interface division. “Seamless integration across Microchip’s portfolio simplifies development and lowers risk, helping designers accelerate time-to-market.”

The MCPF1525 features a customised integrated inductor for low conducted and radiated noise, enhancing signal integrity, data accuracy and reliability of high-speed computing, helping reduce repeated data transmissions that waste valuable system power and time.

The post New Power Module Enhances AI Data Centre Power Density and Efficiency appeared first on ELE Times.

Budget 2026–27: How a Rare Earth Corridor Can Power India’s Electronics & Automotive Manufacturing Push

Tue, 02/03/2026 - 13:47

Union Finance Minister Smt Nirmala Sitharaman unveiled the Union Budget 2026-27 on the floor of the House, amid India’s drive to capitalise on its electronics and IT sector while boosting both defence expenditure and technology. The budget proposes to establish a Rare-Earth Corridor in mineral-rich states, including Tamil Nadu, Andhra Pradesh, Odisha and Kerala. The move is set to benefit the automotive and electronics industries by ensuring a secure and self-sustaining rare-earth supply that complements India’s upward momentum in automobile and electronics manufacturing.

The Union Budget 2026-27 makes some pointed and striking announcements to support the country’s momentum toward self-reliance and security, both within and across its borders.

Why are Rare Earths Important?

As the world moves toward more sustainable solutions, the technologies driving this shift are becoming increasingly dependent on rare earth materials—from batteries and electric motors to the magnets that power them. Parallel to this trend, a global energy transition is also gaining momentum. With electric technologies such as EVs and e-buses set to dominate the future, these materials have become strategically critical. This makes it imperative to secure their supply—either by sourcing them from geopolitically stable regions or by building self-sufficiency across the rare earth value chain.

Quite Rare “Rare-Earth” Materials 

As the name itself suggests, rare-earth materials are scarce, and geopolitics makes them far rarer in practice. The challenge with rare-earth mining lies not only in scarcity but also in the hazardous nature of the extraction process itself. Poorly regulated mining and processing have left long-term environmental and human health consequences, with communities exposed to toxic by-products for decades.

Recalling this impact in his address at Auto EV TVS 2025, Bhaktha Keshavechara, CEO, Chara Technologies, pointed to regions in central China where entire settlements have been affected—often referred to as “cancer villages”—underscoring the severe social cost embedded in the global rare-earth supply chain. Moreover, recycling the material is far costlier than sourcing virgin material, which complicates the economics further.

Easier logistics, Easier access 

A dedicated corridor for rare earths would make these materials far more accessible across India, for both the electronics and automotive industries. This accessibility will translate into more research and more hands-on engagement with the materials, grounding the technology and development process in the nation itself. This will pave the way for more innovations in the field.

National Efforts Unlocking Global Opportunity 

The proposal would strengthen India’s position in the global rare-earth value chain, as supply insecurity currently affects the entire world. The initiative is expected to generate stronger local economies and enhance R&D capacity, integrating India more deeply into global advanced-materials value chains, since dedicated access would also make exports easier.

Coupled with complementary international partnerships and institutional reforms, the corridor would further ensure resilient access to critical minerals. With coordinated domestic and global initiatives, India is gradually positioning itself as a reliable and competitive player in advanced materials value chains.

The post Budget 2026–27: How a Rare Earth Corridor Can Power India’s Electronics & Automotive Manufacturing Push appeared first on ELE Times.

Element Solutions Completes Acquisition of Micromax Business

Tue, 02/03/2026 - 12:59

Element Solutions Inc (ESI) today announced the completion of its acquisition of the Micromax conductive pastes and inks business, effective February 2, 2026. Micromax will operate within MacDermid Alpha Electronics Solutions, part of ESI’s Electronics segment.

The acquisition strengthens ESI’s position as a leading global supplier of specialised electronic materials serving the electronics design and manufacturing industry. By combining Micromax’s expertise in conductive pastes, inks, and ceramic materials with MacDermid Alpha’s broad electronics materials portfolio, ESI expands its ability to support innovation across advanced and high-reliability electronics applications.

“The acquisition of Micromax is a strong strategic fit for ESI and reinforces our focus on high-value, technology-driven businesses,” said Richard Fricke, President, Electronics, adding, “Micromax’s differentiated materials and long-standing customer relationships further strengthen our Electronics segment and expand our ability to support innovation across the electronics manufacturing ecosystem.”

Building Breakthroughs by Leveraging Our Combined Expertise

With Micromax now part of MacDermid Alpha Electronics Solutions, customers gain access to a broader, highly complementary portfolio of advanced electronic materials designed to enable performance, reliability, and design flexibility. The combined portfolio includes thick-film conductive inks compatible with polymer films, glass tapes, metals, and ceramics, as well as Low Temperature Co-Fired Ceramic (LTCC) materials that support high multilayer circuit density and withstand extreme operating environments.

These materials are used in critical electronic functions, such as circuitry, interconnection, and packaging, and serve a wide range of end-use markets, including automotive and advanced mobility, telecommunications/5G infrastructure, consumer electronics, aerospace and defence, and medical devices.

“Micromax brings highly complementary technologies and deep materials expertise that align naturally with MacDermid Alpha’s mission,” said Bruce Moloznik, Sr. VP, Business Integration, MacDermid Alpha Electronics Solutions. “Together, we are building breakthroughs that help customers accelerate innovation, deliver high reliability, and compete with confidence in demanding electronics markets.”

The post Element Solutions Completes Acquisition of Micromax Business appeared first on ELE Times.

Worldwide IT Spending to Grow 10.8% in 2026, Amounting to $6.15 Trillion, Forecasts Gartner

Tue, 02/03/2026 - 12:46

Worldwide IT spending is expected to reach $6.15 trillion in 2026, up 10.8% from 2025, according to the latest forecast by Gartner, Inc., a business and technology insights company.

“AI infrastructure growth remains rapid despite concerns about an AI bubble, with spending rising across AI-related hardware and software,” said John-David Lovelock, Distinguished VP Analyst at Gartner. “Demand from hyperscale cloud providers continues to drive investment in servers optimised for AI workloads.”

Server spending is projected to accelerate in 2026, growing 36.9% year-over-year. Total data centre spending is expected to increase 31.7%, surpassing $650 billion in 2026, up from nearly $500 billion the previous year (see Table 1).

Table 1. Worldwide IT Spending Forecast (Millions of U.S. Dollars)

                          2025 Spending   2025 Growth (%)   2026 Spending   2026 Growth (%)
Data Centre Systems             496,231              48.9         653,403              31.7
Devices                         788,335               9.1         836,417               6.1
Software                      1,249,509              11.5       1,433,633              14.7
IT Services                   1,717,590               6.4       1,866,856               8.7
Communications Services       1,303,651               3.8       1,365,184               4.7
Overall IT                    5,555,316              10.3       6,155,493              10.8

Source: Gartner (February 2026)
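The year-over-year growth percentages in Table 1 follow directly from the spending columns (2026 spending divided by 2025 spending, minus one). A short Python check using the table’s own figures:

```python
# Spending figures from Table 1, in millions of U.S. dollars: (2025, 2026).
spending = {
    "Data Centre Systems":     (496_231, 653_403),
    "Devices":                 (788_335, 836_417),
    "Software":                (1_249_509, 1_433_633),
    "IT Services":             (1_717_590, 1_866_856),
    "Communications Services": (1_303_651, 1_365_184),
    "Overall IT":              (5_555_316, 6_155_493),
}

for segment, (y2025, y2026) in spending.items():
    growth = (y2026 / y2025 - 1) * 100  # year-over-year growth in percent
    print(f"{segment}: {growth:.1f}%")
```

Each computed value matches the growth column of the table to one decimal place.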

Software Spending Shows Second-Highest Growth Potential Despite Lower Revision

Software spending growth for 2026 has been revised slightly downward to 14.7% from 15.2%, covering both application and infrastructure software.

“Despite the modest revision, total software spending will remain above $1.4 trillion,” said Lovelock. “Projections for generative AI (GenAI) model spending in 2026 remain unchanged, with growth expected at 80.8%. GenAI models continue to experience strong growth, and their share of the software market is expected to rise by 1.8% in 2026.”

Device Growth Expected to Slow in 2026

Shipments of mobile phones, PCs, and tablets continue to grow steadily. Total spending on devices is projected to reach $836 billion in 2026. However, market-demand constraints will slow growth to 6.1% in 2026.

“This slowdown is largely due to rising memory prices, which are increasing average selling prices and discouraging device replacements,” said Lovelock. “Additionally, higher memory costs are causing shortages in the lower end of the market, where profit margins are thinner. These factors are contributing to more muted growth in device shipments.”

The post Worldwide IT Spending to Grow 10.8% in 2026, Amounting to $6.15 Trillion, Forecasts Gartner appeared first on ELE Times.

R&S reshapes mid-range market with new 44 GHz FPL spectrum analyzer and 40 MHz real-time analysis

Tue, 02/03/2026 - 09:42

The new R&S FPL1044 from Rohde & Schwarz offers a frequency range of 10 Hz to 44 GHz. It is the first and only spectrum analyser in this price range on the market to reach the 44 GHz milestone, drastically lowering the entry barrier for high-frequency testing.

Setting itself apart within the FPL family, the FPL1044 is the only model to offer a DC coupling option, expanding the measurable frequency range starting from as low as 10 Hz. This feature ensures maximum versatility for analysing signals from extremely low frequencies up to the critical Ka-band. The analyser maintains the compact, lightweight dimensions and robust design of the FPL family, ensuring portability and efficient use of bench space. It features a standard 2.92 mm male input connector for reliable high-frequency measurements.

Launching simultaneously with the R&S FPL1044 is the new R&S FPL1-K41R 40 MHz real-time spectrum analysis option. This upgrade is compatible with all frequency variants of the FPL family, empowering users across the entire product line with the ability to capture and analyse even the shortest events with a Probability of Intercept (POI) time as low as 4.2 µs.

For the new R&S FPL1044, this means 40 MHz real-time frequency analysis is now available up to 44 GHz, providing a complete, affordable solution for the challenging world of high-frequency signal monitoring and component testing.

Targeting critical high-frequency applications

The frequency range of 26.5 GHz to 44 GHz is vital for the aerospace & defence industry, as well as the components industry and for research. It is used for satellite links, radar, radio navigation, earth observation and radio astronomy. Key applications for the R&S FPL1044 are testing satellite and radar systems and components, production quality control of high-frequency components (e.g., filters, amplifiers, travelling-wave tubes), as well as on-site repair and maintenance.

The post R&S reshapes mid-range market with new 44 GHz FPL spectrum analyzer and 40 MHz real-time analysis appeared first on ELE Times.

AR and VR’s Next Breakthrough Will Come From Integration, Not Displays: Vijay Muktamath, Sensesemi Technologies

Tue, 02/03/2026 - 09:23

Augmented and virtual reality have long promised immersive digital experiences, but their journey from spectacle to everyday utility has been slower than expected. While advances in graphics, GPUs, and rendering engines have pushed visual realism forward, the real barriers now lie deeper—inside the wireless links, materials, packaging, and system architectures that must quietly work in unison to make AR and VR practical, portable, and reliable.

In an exclusive interaction with ELE Times, Vijay Muktamath, CEO & Founder of Sensesemi Technologies, offers a grounded view of what truly limits AR and VR today—and what will ultimately enable their mainstream adoption. His insights come at a time when Sensesemi has raised ₹250 million in seed funding, aimed at accelerating the development of integrated edge-AI chips for industrial, automotive, and medical applications.

Why Do AR and VR Still Feel Heavy?

One of the most visible challenges of current AR and VR systems is their bulk. Headsets remain tethered, power-hungry, and constrained—symptoms of a deeper issue rather than mere design immaturity.

According to Muktamath, the root of the problem lies in data movement. “AR and VR demand extremely high data rates,” he explains. “Millimeter-wave technologies in the gigahertz range work well for browsing or radar applications, but once you move into 4K and 8K immersive content, the bandwidth requirement pushes you into terahertz.”

Terahertz frequencies offer vast bandwidth over short distances, making them uniquely suited for point-to-point communication, including intra-device and inter-chip data transfer. This becomes critical as conventional PCB traces introduce losses that are increasingly difficult to manage at higher frequencies.

In other words, as visuals improve, connectivity—not compute—becomes the bottleneck.
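The bandwidth argument can be made concrete with Shannon’s capacity formula, C = B·log2(1 + SNR). The numbers below are illustrative assumptions, not figures from the interview: even a generous 10 GHz channel at 15 dB SNR tops out around 50 Gbit/s, roughly the raw data rate of a single uncompressed 8K/60 stream, which is why immersive content pushes designers toward still wider terahertz channels.

```python
import math

def shannon_capacity_bps(bandwidth_hz: float, snr_db: float) -> float:
    """Shannon channel capacity C = B * log2(1 + SNR), with SNR given in dB."""
    snr_linear = 10 ** (snr_db / 10)
    return bandwidth_hz * math.log2(1 + snr_linear)

# Illustrative assumption: a 10 GHz channel at 15 dB SNR.
c = shannon_capacity_bps(10e9, 15.0)
print(f"{c / 1e9:.1f} Gbit/s")  # roughly 50 Gbit/s

# Raw (uncompressed) 8K at 60 fps, 24 bits/pixel, for comparison:
raw_8k = 7680 * 4320 * 60 * 24
print(f"{raw_8k / 1e9:.1f} Gbit/s")
```

Real links carry compressed video and never reach the Shannon bound, but the orders of magnitude show how quickly 4K/8K immersive traffic exhausts gigahertz-range spectrum.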

Terahertz Is Powerful—but Unforgiving

Yet terahertz is far from a silver bullet. While it unlocks unprecedented data rates, it also introduces a new class of engineering challenges. “Power, noise, packaging—these are all issues,” Muktamath says. “But the real bottleneck is system-level integration.”

Terahertz systems demand precise alignment, tight thermal control, stable clock distribution, and, most critically, spatial coherence. Even minor deviations can degrade RF performance. Testing compounds the problem: lab setups for terahertz systems are bulky, complex, and expensive, making cost control a serious concern for commercial deployment. “Eventually, all of this reflects in the economics,” he adds. “And economics decides whether a technology scales.”

Where CMOS Quietly Takes Over

If terahertz dominates the conversation around connectivity, CMOS quietly anchors the compute backbone of AR and VR systems. “Once the RF signal is converted to digital, that’s where CMOS shines,” Muktamath explains. “Real-time processing, control, power efficiency—this is where CMOS is extremely mature.”

This is also where Sensesemi positions itself. The company focuses on integrated compute and control architectures, enabling on-device processing while supporting lower-bandwidth wireless technologies such as Wi-Fi and BLE for system control and coordination. However, AR and VR systems are not monolithic. “The future architecture will be heterogeneous,” he says. “Terahertz front ends may use silicon-germanium, while compute runs on CMOS. The challenge is integrating these into a single, compact, reliable system.”

Packaging: The Hidden Constraint

That integration challenge places advanced packaging at the center of AR and VR’s evolution. “At terahertz frequencies, even tiny interconnects inside substrates matter,” Muktamath notes. “When you integrate different materials, interfaces and bonding become critical.”

Multi-chip modules, 3D heterogeneous integration, and new interface technologies will determine how efficiently data moves across the system. For AR and VR, where space is at a premium and performance margins are tight, packaging is no longer a back-end consideration—it is a design driver. “This is where the next wave of innovation will come from,” he adds.

Like most deep technologies, AR and VR face a familiar adoption dilemma: performance versus cost. “Today, the world is cost-sensitive,” Muktamath says. “But over time, users start valuing reliability, security, and performance over cheaper alternatives.” He believes AR and VR will reach a similar inflection point—where the value delivered outweighs the premium—much like smartphones and AI systems did in their early days.

Healthcare: Where AR and VR Become Indispensable

While consumer adoption may take longer, Muktamath sees healthcare as the sector where AR and VR will first become indispensable. “In medical robotics and assisted surgeries, AR and VR can overlay real-time insights directly into a surgeon’s field of view,” he explains. “Even if devices are bulky initially, the value they offer is immediate.”

By reducing cognitive load and improving precision, AR and VR can transform how complex procedures are performed—accelerating both adoption and technological refinement.

India’s Moment—If It Thinks Long-Term

On India’s role in this evolving landscape, Muktamath strikes a cautiously optimistic tone. “India started late in deep-tech R&D, but we have started,” he says. “What we need now is patience—capital, policy, and vision that spans decades, not quarters.”

He emphasizes that India’s talent pool is strong, but better alignment is needed between academia, industry, and government to move from research to productization. “Innovation doesn’t end with a paper,” he adds. “It ends with a product that the world uses.”

From Science Fiction to System Engineering

As the conversation draws to a close, Muktamath reflects on how quickly perception can change.

“AR and VR may feel like science fiction today,” he says. “But in the next three to four years, they will be very real.” What will decide that future is not just better visuals or faster processors, but the invisible technologies—terahertz links, CMOS compute, advanced packaging, and system-level coherence—that quietly work together behind the scenes.

The post AR and VR’s Next Breakthrough Will Come From Integration, Not Displays: Vijay Muktamath, Sensesemi Technologies appeared first on ELE Times.

How AI and ML Became Core to Enterprise Architecture and Decision-Making

Tue, 02/03/2026 - 09:00

By: Saket Newaskar, Head of AI Transformation, Expleo

Enterprise architecture is no longer a behind-the-scenes discipline focused on stability and control. It is fast becoming the backbone of how organisations think, decide, and compete. As data volumes explode and customer expectations move toward instant, intelligent responses, legacy architectures built for static reporting and batch processing are proving inadequate. This shift is not incremental; it is structural. In recent times, enterprise architecture has been viewed as an essential business enabler.

The global enterprise architecture tools market is projected to grow to USD 1.60 billion by 2030, driven by organisations aligning technology more closely with business outcomes. At the same time, the increasing reliance on real-time insights, automation, and predictive intelligence is pushing organisations to redesign their foundations. Artificial intelligence (AI) and machine learning (ML) are no longer optional enhancements; they have become essential architectural components that determine how effectively an enterprise can adapt, scale, and create long-term value in a data-driven economy.

Why Modernisation Has Become Inevitable

Traditional enterprise systems were built for reliability and periodic reporting, not for real-time intelligence. As organisations generate data across digital channels, connected devices, and platforms, batch-based architectures create latency that limits decision-making. This challenge is intensifying as enterprises move closer to real-time operations. According to IDC, 75 per cent of enterprise-generated data is predicted to be processed at the edge by 2025. It highlights how data environments are decentralising rapidly. Legacy systems, designed for centralised control, struggle to operate in this dynamic landscape, making architectural modernisation unavoidable.

AI and ML as Architectural Building Blocks

AI and ML have moved from experimental initiatives to core decision engines within enterprise architecture. Modern architectures must support continuous data pipelines, model training and deployment, automation frameworks, and feedback loops as standard capabilities. This integration allows organisations to move beyond descriptive reporting toward predictive and prescriptive intelligence that anticipates outcomes and guides action.

In regulated sectors such as financial services, this architectural shift has enabled faster loan decisions. Moreover, it has improved credit risk assessment and real-time fraud detection via automated data analysis. AI-driven automation has also delivered tangible efficiency gains, with institutions reporting cost reductions of 30–50 per cent by streamlining repetitive workflows and operational processes. These results are not merely the outcomes of standalone tools. Instead, they are outcomes of architectures designed to embed intelligence into core operations.

Customer Experience as an Architectural Driver

Customer expectations are now a primary driver of enterprise architecture. Capabilities such as instant payments, seamless onboarding, and self-service have become standard. In addition, front-end innovations like chatbots and virtual assistants depend on robust, cloud-native and API-led back-end systems that deliver real-time, contextual data at scale. As automation increases, architectures must embed security and compliance by design. Reflecting this shift, industry studies project that the global market for zero-trust security frameworks will exceed USD 60 billion annually by 2027, reinforcing security as a core architectural principle.

Data Governance and Enterprise Knowledge

With the acceleration of AI adoption across organisations, governance has become inseparable from architecture design. Data privacy, regulatory compliance, and security controls must be built into systems from the outset, especially as automation and cloud adoption expand. Meanwhile, enterprise knowledge, proprietary data, internal processes, and contextual understanding have evolved as critical differentiators.

Grounding AI models in trusted enterprise knowledge improves accuracy, explainability, and trust, particularly in high-stakes decision environments. This alignment further ensures that AI systems support real business outcomes rather than producing generic or unreliable insights.

Human Readiness and Responsible Intelligence

Despite rapid technological progress, architecture-led transformation ultimately depends on people. Cross-functional alignment, cultural readiness, and shared understanding of AI initiatives are imperative for sustained adoption. Enterprise architects today increasingly act as translators between business strategy and intelligent systems. Additionally, they ensure that innovation progresses without compromising control.

Looking ahead, speed and accuracy will remain essential aspects of enterprise architecture. However, responsible AI will define long-term success. Ethical use, transparency, accountability, and data protection are becoming central architectural concerns. Enterprises will continue redesigning their architectures to be scalable, intelligent, and responsible for the years to come. Those that fail to modernise or embed AI-driven decision-making risk losing relevance in an economy where data, intelligence, and trust increasingly shape competitiveness.

The post How AI and ML Became Core to Enterprise Architecture and Decision-Making appeared first on ELE Times.

Engineering the Interface: Integrating Electronics into Biocompatible Materials for Next-Generation Medical Devices

Tue, 02/03/2026 - 08:14

By Falgun Jani, Business Head – India Region, Freudenberg Medical – FRCCI

The history of bioelectronics traces a transition from early “animal electricity” experiments to sophisticated implantable and wearable technologies. Today, the boundary between synthetic technology and biological systems is no longer a rigid barrier but a fluid, integrated interface. The field of bioelectronics has undergone a paradigm shift, moving away from “putting electronics in the body” toward “weaving electronics into the tissue”. This evolution is driven by the urgent clinical need for next-gen medical devices that can consistently monitor, diagnose, and treat diseases without triggering the body’s natural defence mechanisms.

Here is a brief history of the evolution of Bioelectronics:

Ancient & Early Modern Era (Pre-1800s)

  • Ancient Medicine: As early as 2750–2500 BC, Egyptians used electric catfish to treat pain. Similar practices continued in Ancient Rome, using torpedo rays for gout and headaches.
  • The Enlightenment: In the 1700s, scientists like Benjamin Franklin used electrostatic machines for medical experiments.

The “Animal Electricity” Revolution (18th–19th Century)

  • Luigi Galvani (1780): Often called the “father of bioelectronics,” Galvani observed frog legs twitching when touched with metal scalpels, leading to the theory of “animal electricity”—the idea that tissues contain an intrinsic electrical fluid.
  • Alessandro Volta (1800): Volta challenged Galvani, proving the twitching was caused by external metals and an electrolyte (the frog’s tissue). This disagreement led Volta to invent the voltaic pile (the first battery).
  • Matteucci & Du Bois-Reymond (1840s): Carlo Matteucci proved that injured tissue generates electric current, while Emil du Bois-Reymond discovered the “action potential” in nerves.

The Rise of Implantable Technology (20th Century)

  • First Electrocardiogram (1912): Initial references to bioelectronics focused on measuring body signals, leading to the development of the ECG.
  • Cardiac Pacemakers (1950s–1960s):

1958: Rune Elmqvist and Åke Senning developed the first fully implantable pacemaker.

1960: The first long-term successful pacemaker was implanted in the U.S. by Wilson Greatbatch.

  • Cochlear Implants (1961–1970s): William House performed the first cochlear implantation in 1961, and multichannel designs were commercialised by the 1970s.
  • Glucose Biosensors (1962): Leland Clark and Lyons invented the first enzymatic glucose sensor, the foundation for modern diabetes management.
  • Transistors & Miniaturisation: The 1960s saw the transition from bulky vacuum-tube devices to transistor-based implants, enabling the modern era of neuromodulation.

Modern Bioelectronic Medicine (21st Century)

  • The Inflammatory Reflex (2002): Kevin J. Tracey discovered that the Vagus Nerve can regulate the immune system. This “eureka moment” launched the field of Bioelectronic Medicine, treating systemic inflammation (e.g., rheumatoid arthritis) with electrical pulses instead of drugs.
  • Organic Bioelectronics (2010s–Present): Research shifted toward soft, flexible materials like conducting polymers and organic electrochemical transistors (OECTs) to better interface with human tissue.

The global bioelectronics market was estimated at USD 10.10 billion in 2025 and is expected to reach USD 11.27 billion in 2026, growing at a CAGR of 12.31% to USD 22.78 billion by 2032, mainly driven by the rising prevalence of chronic diseases and the demand for personalised, patient-centric healthcare solutions.
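As a quick arithmetic check, the 2032 figure is roughly what you get by compounding the 2025 estimate at the stated CAGR over seven years (a sketch using only the figures quoted above):

```python
base_2025 = 10.10   # USD billions, 2025 estimate
cagr = 0.1231       # 12.31% compound annual growth rate
years = 7           # 2025 -> 2032

projected_2032 = base_2025 * (1 + cagr) ** years
print(f"{projected_2032:.2f}")  # close to the quoted USD 22.78 billion
```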

 

Key Applications in 2026 Healthcare

The integration of electronics into biocompatible substrates has led to a new class of medical devices that were once the domain of science fiction.

Schematic overview of emerging strategies for bio-inspired electronics and neural interfaces
Neural Interfaces and “Living Electrodes”: From simple deep brain stimulation, we are moving towards Biohybrid Neural Interfaces that use tissue-engineered axons to bridge the gap between a computer chip and the motor cortex. By “growing” biological wires into the brain, these devices achieve a level of chronic stability that allows paralysed patients to control robotic limbs with the same fluid precision as a biological arm.

Soft Bio-Sensing Wearables: Modern-day wearables have moved from the wrist to the skin. “Electronic skin” (e-skin) patches—ultrathin, breathable, and biocompatible—now monitor biochemical markers in sweat, such as cortisol and glucose, in real-time. These devices utilise MXenes and Graphene to detect molecular changes at concentrations previously only reachable via blood draws.

 

Closed-Loop Bioelectronic Medicine: The concept of “electroceuticals” is now a clinical reality. Small, biocompatible devices implanted on the vagus nerve can monitor inflammatory markers and automatically deliver precise electrical pulses to inhibit the “cytokine storm” associated with autoimmune diseases like rheumatoid arthritis and Crohn’s disease.

Some key challenges remain to be resolved:

Engineering Challenges:

  • Stability and Bio-Integration

Despite the progress, engineering the interface remains a complex task. The physiological environment is incredibly harsh—warm, salty, and chemically active—leading to the degradation of many synthetic materials.

  • Hermetic Packaging vs. Biocompatibility

Engineers must strike a critical balance between sealing sensitive electronics from moisture and keeping the outer layer soft enough to integrate with tissue. In 2026, atomic layer deposition (ALD) is used to create nanometre-thin ceramic coatings that provide moisture barriers without adding stiffness.

  • The Power Problem

Traditional batteries are bulky and toxic. Next-generation devices are increasingly powered by biofuel cells that harvest energy from blood glucose or through ultrasonic power transfer, which allows deep-seated implants to be recharged wirelessly through layers of muscle and bone.

Ethics and the Regulatory Challenges

  • As we successfully integrate electronics into the human body, the ethical implications have shifted from “safety” to “agency and privacy.”
  • The EU Medical Device Regulation (MDR) and the FDA’s Digital Health Centre of Excellence have established new frameworks for “Neural Data Privacy.” Since these devices can read and potentially influence neural states, the data they produce is classified as a biological asset.
  • Furthermore, the longevity of these devices raises questions about “hardware obsolescence” in living patients. Engineering the interface now includes a roadmap for software updates and long-term support for implants that may stay in the body for decades.

The Future: Toward “Living” Bioelectronics

The trend is moving toward synthetic biology-electronics hybrids. We are seeing prototypes of devices where genetically engineered cells act as sensors, producing an electrical signal when they detect a specific pathogen or cancer marker.

By engineering the interface at the molecular level, we are not just repairing the body; we are enhancing its resilience.

The integration of electronics into biocompatible materials is more than a technical achievement—it is the foundation of a new era of personalised medicine where the device and the patient are the same.

The post Engineering the Interface: Integrating Electronics into Biocompatible Materials for Next-Generation Medical Devices appeared first on ELE Times.

EU–India FTA: A Defining Moment for India’s Electronics and Semiconductor Industry

Tue, 02/03/2026 - 07:18

As global electronics and semiconductor supply chains are restructured for resilience and trust, the proposed EU–India Free Trade Agreement (FTA) is emerging as a pivotal opportunity for the Indian industry. More than a tariff-reduction exercise, the agreement has the potential to integrate India more deeply into Europe’s advanced electronics and semiconductor value chains. For India, the FTA represents a transition—from cost-driven manufacturing to value-driven, technology-led partnership.

The European Union is one of the world’s most quality-conscious electronics markets, with strong demand across automotive electronics, industrial automation, medical devices, power electronics, renewable energy systems, and telecom infrastructure. Under the EU–India FTA, reduced tariffs and streamlined regulatory frameworks will enhance the competitiveness of Indian electronics products.

Alignment on conformity assessment and technical standards will shorten qualification cycles and lower compliance costs, enabling Indian manufacturers to integrate directly into EU OEM and Tier-1 supply chains. As European companies pursue China-plus-one sourcing strategies, India stands to gain as a reliable and scalable manufacturing base.

India’s electronics industry has historically been assembly-led, but this is changing rapidly. Supported by policy incentives and growing design capabilities, Indian firms are expanding into PCB assembly, system integration, testing, and engineering services.

The EU–India FTA accelerates this shift by encouraging European OEMs to localise higher-value activities in India. Electronics manufacturing services (EMS) providers, component manufacturers, and design-led companies can leverage European partnerships to move beyond box build toward design-for-manufacturing, reliability engineering, and lifecycle management—key to long-term competitiveness.

Semiconductors are central to the EU–India technology partnership. The FTA aligns closely with India’s Semiconductor Mission and the EU Chips Act, creating a stable framework for collaboration across design, packaging, testing, and advanced manufacturing.

India’s strengths in chip design, embedded systems, and engineering talent complement Europe’s leadership in semiconductor equipment, materials, power electronics, and automotive-grade chips. Reduced barriers for capital equipment, technology transfer, and skilled workforce mobility can accelerate joint investments in OSAT, ATMP, and speciality semiconductor manufacturing.

This collaboration positions India as a trusted node in Europe’s semiconductor supply chain diversification efforts.

Beyond large fabs and design houses, the FTA creates opportunities for component and equipment suppliers. Demand for sensors, power modules, passive components, connectors, precision tooling, and clean-room equipment is expected to rise as European electronics and semiconductor companies expand operations in India.

Indian MSMEs operating in these segments can integrate into European Tier-2 and Tier-3 supply chains, benefiting from long-term sourcing contracts, technology upgrades, and exposure to global quality benchmarks.

The EU–India FTA also strengthens innovation linkages. Indian start-ups working in semiconductor IP, AI-enabled hardware, EV electronics, power electronics, and Industry 4.0 solutions will gain improved access to European R&D ecosystems, pilot customers, and funding platforms.

Europe’s strong IP regimes and industrial testbeds offer Indian deep-tech start-ups a credible pathway from development to global commercialisation.

Alignment with European technical, safety, and environmental standards will enhance the global credibility of Indian electronics and semiconductor products. Standards convergence reduces certification duplication, improves supplier trust, and increases acceptance across multiple export markets. For global buyers, this translates into confidence in Indian suppliers—an essential requirement in electronics and semiconductor sourcing.

The EU–India FTA arrives at a defining moment for the electronics and semiconductor industry. With effective execution, it can accelerate India’s shift from assembly-led operations to value-added manufacturing, design, and innovation. More importantly, it positions India as a strategic, trusted partner in global electronics and semiconductor supply chains—built on quality, resilience, and long-term collaboration.

Devendra Kumar
Editor

The post EU–India FTA: A Defining Moment for India’s Electronics and Semiconductor Industry appeared first on ELE Times.

Future-Proofing Bharat: India’s Multi-Billion Dollar AI Strategy Revealed

Mon, 02/02/2026 - 14:29

India will host the AI Impact Summit 2026 under the Ministry of Electronics and Information Technology from February 16 to 20 this month. The Summit has received a phenomenal response from across the world and is shaping up to be the biggest event of its kind globally, Union Minister Ashwini Vaishnaw said on Friday.

The summit week will feature around 500 curated events across Bharat Mandapam and Sushma Swaraj Bhawan. The AI Impact Expo will host over 840 exhibitors, including country pavilions, Ministries, State governments, industry, start-ups, and research institutions, showcasing AI solutions with proven real-world impact.

The conference has confirmed the participation of 15 Heads of State Government, more than 40 Ministers, over 100 leading CEOs and CXOs, and more than 100 eminent academics. Industry partners, including Jio, Qualcomm, OpenAI, Nvidia, Google, Microsoft, Adobe, and the Gates Foundation, are expected to participate in the event.

The Minister also said that leading IT companies have developed over 200 focused, sector-specific AI models, expected to be launched during the upcoming summit. With investments worth nearly $70 billion already flowing into the AI infrastructure layer, that figure could double by the conclusion of the event, he added. The AI talent pool is expected to be scaled up by extending infrastructure and industry-finalised curricula to 500 universities.

Budget Highlights for the AI Sector

Additionally, the Union Budget 2026-27 includes specific provisions to develop India's AI landscape. To build out digital infrastructure, the government has proposed an additional investment of USD 90 billion specifically for AI data centres. To encourage long-term investment, it has also proposed a tax holiday until 2047 for foreign companies that use data centre services from India to provide cloud services to customers globally; such companies will serve Indian customers through an Indian reseller entity. Simultaneously, a safe harbour of 15 per cent on cost has been proposed where the data centre service provider in India is a related entity.

The government has proposed several domains for AI integration in the Indian landscape:

  • Governance: Serving as a force multiplier for improved public service delivery.
  • Supporting new technologies: Adopting new technologies in various sectors through the AI Mission and National Quantum Mission.
  • Labour market analysis: Assessing the impact of emerging tech such as AI on job roles and skill requirements.
  • Bharat-VISTAAR: A new multilingual AI tool designed for broader linguistic accessibility.
  • Agriculture: Integration with AgriStack portals and ICAR agricultural practice packages.
  • Healthcare & accessibility: R&D and integration into assistive devices for People with Disabilities manufactured by Artificial Limbs Manufacturing Corporation of India (ALIMCO).
  • Customs & security: Expanding non-intrusive scanning and advanced imaging for risk assessment.
  • Education: Embedding AI modules directly into the national education curriculum from school level onwards and for teacher training.
  • Professional development: Upskilling and reskilling programs for engineers and tech professionals.
  • Employment matching: AI-enabled platforms to connect workers with jobs and training opportunities.

India’s AI Landscape

Since 2020, the Artificial Intelligence (AI) start-up ecosystem in India has grown rapidly, with over 150 homegrown AI start-ups raising more than $1.5 billion in funding as of September 2025.

As of early 2026, there are over 1,900 AI companies in India, 555 of them funded. AI start-ups now serve industries ranging from healthcare, agriculture, aerospace and defence, and navigation to education, manufacturing, banking, and e-commerce. Notable start-ups in the sector include Sarvam AI, Krutrim, Observe.AI, Avaamo, Nanonets, and Atomicwork.

Global giants like IBM, Google, Microsoft, OpenAI, and Nvidia have established or expanded their R&D centres, engineering hubs, and regional offices in India to leverage the country’s vast tech talent pool and rapidly expanding digital economy. Players such as Perplexity have partnered with telecom giants like Airtel to expand their reach. Simultaneously, Anthropic, the AI start-up backed by Google and Amazon, plans to open its first Indian office in Bengaluru in early 2026, focusing on AI tools and tapping into the local developer ecosystem.

Future of AI in India

The AI market is projected to reach $126 billion by 2030, with a long-term contribution of $1.7 trillion to India’s GDP by 2035.

These developments, coupled with the government’s push to boost the sector and its focus on “Sovereign AI”, which aims to reduce dependency on foreign technology and build custom chips within 3-5 years, can position India as a formidable global force in Artificial Intelligence. The country already ranks third globally in AI competitiveness.

By: Shreya Bansal, Sub-Editor

The post Future-Proofing Bharat: India’s Multi-Billion Dollar AI Strategy Revealed appeared first on ELE Times.
