Feed aggregator

Edge AI powers the next wave of industrial intelligence

EDN Network - Mon, 11/17/2025 - 16:00

Artificial intelligence is moving out of the cloud and into the operations that create and deliver products to us every day. Across manufacturing lines, logistics centers, and production facilities, AI at the edge is transforming industrial operations, bringing intelligence directly to the source of data. As the industrial internet of things (IIoT) matures, edge-based AI is no longer an optional enhancement; it’s the foundation for the next generation of productivity, quality, and safety in industrial environments.

This shift is driven by the need for real-time, contextually aware intelligence—systems that can see, hear, and even “feel” their surroundings, analyze sensor data instantly, and make split-second decisions without relying on distant cloud servers. From predictive maintenance and automated inspection to security monitoring and logistics optimization, edge AI is redefining how machines think and act.

Why industrial AI belongs at the edge

Traditional industrial systems rely heavily on centralized processing. Data from machines, sensors, and cameras is transmitted to the cloud for analysis before insights are sent back to the factory floor. While effective in some cases, this model is increasingly impractical and inefficient for modern, latency-sensitive operations.

Edge processing addresses this. Instead of sending vast streams of data off-site, intelligence is brought closer to where the data is created: within or around the machine, gateway, or local controller itself. This local processing offers three primary advantages:

  • Low latency and real-time decision-making: In production lines, milliseconds matter. Edge-based AI can detect anomalies or safety hazards and trigger corrective actions instantly without waiting for a network round-trip.
  • Enhanced security and privacy: Industrial environments often involve proprietary or sensitive operational data. Processing locally minimizes data exposure and vulnerability to network threats.
  • Reduced power and connectivity costs: By limiting cloud dependency, edge systems conserve bandwidth and energy, a crucial benefit in large, distributed deployments such as logistics hubs or complex manufacturing centers.

These benefits have sparked a wave of innovation in AI-native embedded systems, designed to deliver high performance, low power consumption, and robust environmental resilience—all within compact, cost-optimized footprints.

Edge-based AI is the foundation for the next generation of productivity, quality, and safety in industrial environments, delivering low latency, real-time decision-making, enhanced security and privacy, and reduced power and connectivity costs. (Source: Adobe AI Generated)

Localized intelligence for industrial applications

Edge AI’s success in IIoT is largely based on contextual awareness, which can be defined as the ability to interpret local conditions and act intelligently based on situational data. This requires multimodal sensing and inference across vision, audio, and even haptic inputs. In manufacturing, for example:

  • Vision-based inspection systems equipped with local AI can detect surface defects or assembly misalignments in real time, reducing scrap rates and downtime.
  • Audio-based diagnostics can identify early signs of mechanical failure by recognizing subtle deviations in sound signatures.
  • Touch or vibration sensors help assess machine wear, contributing to predictive maintenance strategies that reduce unplanned outages.

In logistics and security, edge AI cameras provide real-time monitoring, object detection, and identity verification, enabling autonomous access control or safety compliance without constant cloud connectivity. A practical example of this approach is a smart license-plate-recognition system deployed in industrial zones, a compact unit capable of processing high-resolution imagery locally to grant or deny vehicle access in milliseconds.

In all of these scenarios, AI inference happens on-site, reducing latency and power consumption while maintaining operational autonomy even in network-constrained environments.

Low power, low latency, and local learning

Industrial environments are unforgiving. Devices must operate continuously, often in high-temperature or high-vibration conditions, while consuming minimal power. This has made energy-efficient AI accelerators and domain-specific system-on-chips (SoCs) critical to edge computing.

A good example of this trend is the early adoption of the Synaptics Astra SL2610 SoC platform by Grinn, which has already resulted in a production-ready system-on-module (SOM), Grinn AstraSOM-261x, and a single-board computer (SBC). By offering a compact, industrial-grade module with full software support, Grinn enables OEMs to accelerate the design of new edge AI devices and shorten time to market. This approach helps bridge the gap between advanced silicon capabilities and practical system deployment, ensuring that innovations can quickly translate into deployable industrial solutions.

The Grinn–Synaptics collaboration demonstrates how industrial AI systems can now run advanced vision, voice, and sensor fusion models within compact, thermally optimized modules.

These platforms combine:

  • Embedded quad-core Arm processors for general compute tasks
  • Dedicated neural processing units (NPUs) delivering trillions of operations per second (TOPS) for inference
  • Comprehensive I/O for camera, sensor, and audio input
  • Industrial-grade security

Equally important is support for custom small language models (SLMs) and on-device training capabilities. Industrial environments are unique. Each factory line, conveyor system, or inspection station may generate distinct datasets. Edge devices that can perform localized retraining or fine-tuning on new sensor patterns can adapt faster and maintain high accuracy without cloud retraining cycles.

The Grinn OneBox AI-enabled industrial SBC, designed for embedded edge AI applications, leverages a Grinn AstraSOM compute module and the Synaptics SL1680 processor. (Source: Grinn Global)

Emergence of compact multimodal platforms

The recent introduction of next-generation SoCs such as Synaptics’ SL2610 underscores the evolution of edge AI hardware. Built for embedded and industrial systems, these platforms offer integrated NPUs, vision digital-signal processors, and sensor fusion engines that allow devices to perceive multiple inputs simultaneously, such as camera feeds, audio signals, or even environmental readings.

Such capabilities enable richer human-machine interaction in industrial contexts. For instance, a line operator can use voice commands and gestures to control inspection equipment, while the system responds with real-time feedback through both visual indicators and audio prompts.

Because the processing happens on-device, latency is minimal, and the system remains responsive even if external networks are congested. Low-power design and adaptive performance scaling also make these platforms suitable for battery-powered or fanless industrial devices.

From the cloud to the floor: practical examples

Collaborations like the Grinn–Synaptics development have produced compact, power-efficient edge computing modules for industrial and smart city deployments. These modules integrate high-performance neural processing, customized AI implementations, and ruggedized packaging suitable for manufacturing and outdoor environments.

Deployed in use cases such as automated access control and vision-guided robotics, these systems demonstrate how localized AI can replace bulky servers and external GPUs. All inference, from image recognition to object tracking, is performed on a module the size of a matchbox, using only a few watts of power.

The results:

  • Reduced latency from hundreds of milliseconds to under 10 ms
  • Lower total system cost by eliminating cloud compute dependencies
  • Improved reliability in areas with limited connectivity or strict privacy requirements

The same architecture supports multimodal sensing, enabling combined visual, auditory, and contextual awareness—key for applications such as worker safety systems that must recognize both spoken alerts and visual cues in noisy and complex factory environments.

Toward self-learning, sustainable intelligence

The evolution of edge AI is about more than just performance; it’s about autonomy and adaptability. With support for custom, domain-specific SLMs, industrial systems can evolve through continual learning. For example, an inspection model might retrain locally as lighting conditions or material types change, maintaining precision without manual recalibration.

Moreover, the combination of low-power processing and localized AI aligns with growing sustainability goals in industrial operations. Reducing data transmission, cooling needs, and cloud dependencies contributes directly to lower carbon footprints and energy costs, critical as industrial AI deployments scale globally.

Edge AI as the engine of industrial transformation

The rise of AI at the edge marks a turning point for IIoT. By merging context-aware intelligence with efficient, scalable compute, organizations can unlock new levels of operational visibility, flexibility, and resilience.

Edge AI is no longer about supplementing the cloud; it’s about bringing intelligence where it’s most needed, empowering machines and operators alike to act faster, safer, and smarter.

From the shop floor to the supply chain, localized, multimodal, and energy-efficient AI systems are redefining the digital factory. With continued innovation from technology partnerships that blend high-performance silicon with real-world design expertise, the industrial world is moving toward a future where every device is an intelligent, self-aware contributor to production excellence.

The post Edge AI powers the next wave of industrial intelligence appeared first on EDN.

imec achieves record GaN breakdown exceeding 650V on Shin-Etsu Chemical’s 300mm QST substrate

Semiconductor today - Mon, 11/17/2025 - 14:04
Tokyo-based Shin-Etsu Chemical Co Ltd says that its QST substrate has been adopted for the 300mm gallium nitride (GaN) power device development program at nanoelectronics research center imec of Leuven, Belgium, where sample evaluation is in progress. In the evaluation, a 5µm-thick high-electron-mobility transistor (HEMT) device achieved a record breakdown voltage, for a 300mm QST substrate, of more than 650V...


Microchip Technology Unveils Model Context Protocol (MCP) Server to Power AI-Driven Product Data Access

ELE Times - Mon, 11/17/2025 - 11:51
Microchip Technology announced the launch of its Model Context Protocol (MCP) Server. An AI interface, the MCP Server connects directly with compatible AI tools and large language models (LLMs) to provide the context these systems need to answer questions. Through simple conversational queries, the MCP Server enables users to retrieve verified, up-to-date Microchip public data, including product specifications, datasheets, inventory, pricing, and lead times.

Built on MCP streamable HTTP standards, the server delivers context-aware and JSON-encoded responses optimized for AI clients such as Copilots, AI chatbots, LLM-based IDEs and enterprise AI agents. The platform supports a wide range of applications and integrates Microchip public data directly into development environments and intelligent assistants.
“The launch of our MCP Server is another example of how Microchip is leaning into AI and providing AI-based tools that help make life easier for our customers,” said Rich Simoncic, chief operating officer for Microchip Technology. “We’re dedicated to harnessing the power of AI to boost productivity and drive innovation. By enabling instant access to verified product information within the AI platforms developers already rely on, we’re removing barriers and making it easier to design with Microchip solutions.”

The post Microchip Technology Unveils Model Context Protocol (MCP) Server to Power AI-Driven Product Data Access appeared first on ELE Times.

SemiQ adds 7.4, 14.5 and 34mΩ SOT-227 modules to 1200V Gen3 SiC MOSFET line

Semiconductor today - Mon, 11/17/2025 - 11:45
SemiQ Inc of Lake Forest, CA, USA — which designs, develops and manufactures silicon carbide (SiC) power semiconductors and 150mm SiC epitaxial wafers for high-voltage applications — has expanded its family of 1200V Gen3 SiC MOSFETs, launching five SOT-227 modules that offer RDSon values of 7.4mΩ, 14.5mΩ and 34mΩ. The firm’s GCMS modules, which feature Schottky barrier diodes (SBDs), have lower switching losses at high temperature, especially compared with the non-SBD GCMX modules...


In memory of Mulyk Andrii Oleksandrovych

News (Новини) - Mon, 11/17/2025 - 10:34

We have received word that Mulyk Andrii Oleksandrovych (18.12.1994 – 05.11.2025) was killed in the war...

Mulyk Andrii Oleksandrovych was a graduate of the Educational and Research Institute for Applied Systems Analysis (Department of System Design).

The "For Merit to Igor Sikorsky KPI" distinction: from KPI with gratitude and respect

News (Новини) - Mon, 11/17/2025 - 10:00

According to international and Ukrainian rankings, Igor Sikorsky Kyiv Polytechnic Institute (KPI) ranks among the best technical universities. Transnational and Ukrainian companies, as well as leading research institutions, cooperate productively with Kyiv Polytechnic. Two years ago, the university introduced a distinction for strategic partners in recognition of their significant contribution to the university's development and to the training of highly qualified specialists and researchers.

The ecosystem view around an embedded system development

EDN Network - Mon, 11/17/2025 - 06:46

Like in nature, development tools for embedded systems form “ecosystems.” Some ecosystems are very self-contained, with little overlap with others, while other ecosystems are open and broad, with support for everything but the kitchen sink. Moreover, developers and engineers have strong opinions (to put it mildly) on this subject.

So, we developed a greenhouse that sustains multiple ecosystems; the greenhouse demo we built shows multiple microcontrollers (MCUs) and their associated ecosystems working together.

The greenhouse demo

The greenhouse demo is a simplified version of a greenhouse controller. The core premise of this implementation is to intelligently open/close the roof to allow rainwater into the greenhouse. This is implemented using a motorized canvas tarp mechanism. The canvas tarp was created from old promotional canvas tote bags and sewn into the required shape.

The mechanical guides and lead screw for the roof are repurposed from a 3D printer with a stepper motor drive. An evaluation board is used as a rain sensor. Finally, a user interface panel enables a manual override of the automatic (rain) controls.

Figure 1 The greenhouse demo is mounted on a tradeshow wedge. Source: Microchip

It’s implemented as four function blocks:

  1. A user interface, capacitive touch controller with the PIC32CM GC Curiosity Pro (EA36K74A) in VS Code
  2. A smart stepper motor controller reference design built on the AVR EB family of MCUs in MPLAB Code Configurator Melody
  3. A main application processor with SAM E54 on the Xplained Pro development kit (ATSAME54-XPRO), running Zephyr RTOS
  4. A liquid detector using the MTCH9010 evaluation kit

The greenhouse demo outlined in this article is based on a retractable roof developed by Microchip’s application engineering team in Romania. This reference design is implemented in a slightly different fashion from the greenhouse, with the smart stepper motor controller interfacing directly with the MTCH9010 evaluation board to control the roof position. This configuration is ideal for applications where the application processor does not need to be aware of the current state of the roof.

Figure 2 This retractable roof demo was developed by a design team in Romania. Source: Microchip

User interface controller

Since the control panel for this greenhouse would normally be in an area where water is expected, it was important to take this into account when designing the user interface. Capacitive touch panels are attractive because they have no moving parts and can easily be sealed under a panel. However, capacitive touch can be vulnerable to false triggers from water.

To minimize these effects, an MCU with an enhanced peripheral touch controller (PTC) was used to contain the effects of any moisture present. Development of the capacitive touch interface was aided with MPLAB Harmony and the capacitive touch libraries, which greatly reduce the difficulty in developing touch applications.
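
Touch libraries typically expose per-sensor signal deltas, and a common water-tolerance tactic is a guard or shield channel whose reading vetoes button reports when moisture couples across the panel. The sketch below illustrates that idea only; it is not Microchip's touch library API, and the function and threshold names are hypothetical:

```c
#include <stdbool.h>
#include <stdint.h>

/* Hypothetical thresholds; real values come from tuning the panel. */
#define TOUCH_THRESHOLD  50   /* counts above baseline to call it a touch */
#define GUARD_THRESHOLD  80   /* guard-channel delta indicating moisture  */

/* Report a button as pressed only when its delta crosses the touch
 * threshold AND the guard channel shows no large moisture signal. */
bool button_pressed(int16_t button_delta, int16_t guard_delta)
{
    if (guard_delta > GUARD_THRESHOLD) {
        return false;          /* water film detected: suppress touches */
    }
    return button_delta > TOUCH_THRESHOLD;
}
```

In a real design the guard electrode surrounds the buttons, so a spreading water film raises the guard delta before it can fake a button press.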

The user interface for this demo is composed of a PIC32CM GC Curiosity Pro (EA36K74A) development kit connected to a QT7 Xplained Pro Extension (ATQT7-XPRO) kit to provide a capacitive slider and two touch buttons.

Figure 3 The QT7 Xplained extension kit comes with a self-capacitance slider and two self-capacitance buttons, alongside 8 LEDs for button-state and slider-position feedback. Source: Microchip

The two buttons allow the user to fully open or close the tarp, while the slider enables partial open or closed configurations. When the user interface is idle for 30 seconds or more, the demo switches back to the MTCH9010 rain sensor to automatically determine whether the tarp should be opened or closed.
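
This manual-override timeout can be modeled as a tiny state machine: any touch switches the system to manual mode and restarts a timer, and after 30 idle seconds control returns to the rain sensor. A minimal sketch, assuming a millisecond tick (names and structure are illustrative, not the demo's actual firmware):

```c
#include <stdint.h>

#define IDLE_TIMEOUT_MS 30000u  /* 30 s of no touches returns to auto mode */

typedef enum { MODE_AUTO, MODE_MANUAL } ui_mode_t;

typedef struct {
    ui_mode_t mode;
    uint32_t  last_touch_ms;
} ui_state_t;

/* Call on every touch event with the current millisecond tick. */
void ui_on_touch(ui_state_t *s, uint32_t now_ms)
{
    s->mode = MODE_MANUAL;
    s->last_touch_ms = now_ms;
}

/* Call periodically; reverts to the rain sensor after the idle timeout. */
void ui_poll(ui_state_t *s, uint32_t now_ms)
{
    if (s->mode == MODE_MANUAL &&
        (now_ms - s->last_touch_ms) >= IDLE_TIMEOUT_MS) {
        s->mode = MODE_AUTO;   /* rain sensor drives the tarp again */
    }
}
```

The unsigned subtraction keeps the comparison correct even when the tick counter wraps around.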

Smart stepper motor controller

The smart stepper motor controller is a reference design that uses the AVR EB family of MCUs to generate the waveforms required for full-stepping, half-stepping, or microstepping of a stepper motor. Because the MCU generates the waveforms, the motor can behave independently rather than requiring logic or interaction from the main application processor(s) elsewhere in the system. This autonomy is useful when the motor must respond to signals such as limit switches, mechanical stops, quadrature encoders, or other monitored inputs.

Figure 4 Smart stepper motor reference design uses core independent peripherals (CIPs) inside the MCUs to microstep a bipolar winding stepper motor. Source: Microchip

The MCU receives commands from the application processor and executes them to move the tarp to a specified location. One of the nice things about this being a “smart” stepper motor controller is that the functionality can be adjusted in software. For instance, if analog signals or limit switches are added, the firmware can be modified to account for these signals.
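
A command from the application processor can be as simple as a target position; the motor controller then turns it into a step count and direction relative to where the tarp currently is. The sketch below shows the general shape of such a move planner; the reference design's actual command set and scaling are not published here, so the constant and names are assumptions:

```c
#include <stdint.h>
#include <stdlib.h>

#define STEPS_PER_MM 80  /* illustrative: lead-screw pitch x microstepping */

typedef struct {
    int32_t steps;       /* number of (micro)steps to issue */
    int8_t  direction;   /* +1 opens the tarp, -1 closes it */
} move_cmd_t;

/* Translate a target tarp position (mm) into a relative move. */
move_cmd_t plan_move(int32_t current_mm, int32_t target_mm)
{
    int32_t delta = target_mm - current_mm;
    move_cmd_t cmd = {
        .steps = (int32_t)labs((long)delta) * STEPS_PER_MM,
        .direction = (delta >= 0) ? 1 : -1,
    };
    return cmd;
}
```

Because the planner is pure software, adding limit-switch clamping or acceleration ramps later only touches this layer, which is the flexibility the article attributes to a "smart" controller.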

While the PCB attached to the motor is custom, this function block can be replicated with the multi-phase power board (EV35Z86A), the AVR EB Curiosity Nano adapter (EV88N31A) and the AVR EB Curiosity Nano (EV73J36A).

Application processor and other ecosystems

The application processor in this demo is a SAM E54 MCU that runs Zephyr real-time operating system (RTOS). One of the biggest advantages of Zephyr over other RTOSes and toolchains is the way that the application programming interface (API) is kept uniform with clean divisions between the vendor-specific code and the abstracted, higher-level APIs. This allows developers to write code that works across multiple MCUs with minimal headaches.

Zephyr also has robust networking support and an ever-expanding list of capabilities that make it a must-have for complex applications. Zephyr is open source (Apache 2.0 licensing) with a very active user base and support for multiple different programming tools such as—but not limited to—OpenOCD, Segger J-Link and gdb.

Beyond the ecosystems used directly in the greenhouse demo, there are several other options. Some of the more popular examples include IAR Embedded Workbench, Arm Keil, MikroE’s Necto Studio and SEGGER Embedded Studio. These tools are premium offerings with advanced features and high-quality support to match.

For instance, I recently had an issue with booting Zephyr on an MCU where I could not access the usual debuggers and printf was not an option. I used SEGGER Ozone with a J-Link+ to troubleshoot this complex issue. Ozone is a special debug environment that eschews the usual IDE tabs to provide the developer with more specialized windows and screens.

In my case, the issue occurred where the MCU would start up correctly from the debugger, but not from a cold start. After some troubleshooting and testing, I eventually determined one of the faults was a RAM initialization error in my code. I patched the issue with a tiny piece of startup assembly that ran before the main kernel started up. The snippet of assembly that I wrote is attached below for anyone interested.
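
The author's actual assembly snippet did not survive in this copy of the article. For the general shape of this class of fix: classic Cortex-M startup copies initialized data from flash to RAM and zeroes .bss before the kernel runs, and a cold-start failure like the one described often traces back to that step being skipped or wrong. Below is a hedged C equivalent, not the author's code; on real hardware the pointers come from linker symbols (e.g. _sdata/_sidata/_sbss, names vary by toolchain):

```c
#include <stddef.h>
#include <stdint.h>

/* Copy the .data image from flash to RAM and zero .bss. On a real part
 * this runs before main(), with pointers supplied by linker symbols. */
void early_ram_init(uint32_t *data_dst, const uint32_t *data_src,
                    size_t data_words, uint32_t *bss, size_t bss_words)
{
    for (size_t i = 0; i < data_words; i++) {
        data_dst[i] = data_src[i];   /* .data: load values from flash */
    }
    for (size_t i = 0; i < bss_words; i++) {
        bss[i] = 0u;                 /* .bss: C requires it zeroed    */
    }
}
```

This also explains the "works under the debugger, fails from cold start" symptom: a debug session can leave RAM holding values from a previous run, masking the missing initialization.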

The moral of the story is that development environments offer unique advantages. An example of this is IAR adding support for Zephyr to its IDE solution. In many ways, the choice of what ecosystem to develop in is up to personal preference.

There isn’t really a wrong answer if the ecosystem does what you need to make your design work. The greenhouse demo embodies this by showing multiple ecosystems and toolchains working together in a single system.

Robert Perkel is an application engineer at Microchip Technology. In this role, he develops technical content such as application notes, contributed articles, and design videos. He is also responsible for analyzing use-cases of peripherals and the development of code examples and demonstrations. Perkel is a graduate of Virginia Tech where he earned a Bachelor of Science degree in Computer Engineering.

Related Content

The post The ecosystem view around an embedded system development appeared first on EDN.

Switching power supply vs Linear power supply

Reddit:Electronics - Sun, 11/16/2025 - 18:58
The one on the left is the switched-mode power supply. It’s much smaller and lighter, and it can output twice as much current as the linear power supply on the right.

submitted by /u/BlacksmithFar7794

Keithley 2000 / 2015 / 2010 VFD to LED display upgrade

Reddit:Electronics - Sun, 11/16/2025 - 00:38

Good news for Keithley 2000 / 2015 / 2016 / 2010 DMM owners with dim displays.
This is a drop-in LED display conversion kit that replaces the original dim VFD.

submitted by /u/dimmog

My ±37V 1-1.5A Dual rail linear power supply.

Reddit:Electronics - Sat, 11/15/2025 - 22:32

This is the power supply I have built: a dual-rail ±37V, 1-1.5A linear power supply using an LM317 and LM337 for now, until I have built a series-pass voltage and current regulator for it; this is just to get it started. I am also going to add a 0.33Ω resistor between the 15,000µF and 10,000µF capacitors. My noise levels are low, I think, as can be seen in picture 1. The chain is: a soft starter and EMI filter on the AC side before the transformer; a rectifier filtered with small RC filters on each diode; a 20D20 NTC; 15,000µF, 10,000µF, and 5,630µF (+) capacitors; a capacitance multiplier; an EMI DC filter; another DC filter; the regulator; then the output.

submitted by /u/Whyjustwhydothat

Weekly discussion, complaint, and rant thread

Reddit:Electronics - Sat, 11/15/2025 - 18:00

Open to anything, including discussions, complaints, and rants.

Sub rules do not apply, so don't bother reporting incivility, off-topic, or spam.

Reddit-wide rules do apply.

To see the newest posts, sort the comments by "new" (instead of "best" or "top").

submitted by /u/AutoModerator

First Solar selects South Carolina for new US production facility

Semiconductor today - Fri, 11/14/2025 - 19:25
Cadmium telluride (CdTe) thin-film photovoltaic (PV) module maker First Solar Inc of Tempe, AZ, USA is to establish a new facility in Gaffney, Cherokee County, South Carolina, to onshore final production processes for Series 6 Plus modules initiated by the company’s international fleet. The firm expects to spend about $330m to establish the new facility, which is scheduled to begin commercial operation in second-half 2026. The facility is forecasted to create more than 600 new jobs with an average manufacturing salary of $74,000 per year, approximately twice the per capita income in Cherokee County...

💝 KPI Art gala concert

News (Новини) - Fri, 11/14/2025 - 18:58

The biggest creative event of the year at Igor Sikorsky Kyiv Polytechnic Institute is coming next week! Join us for the KPI Art final, where you can watch performances by 20 finalists from different faculties in 5 categories:
