Microelectronics world news
Next Generation of Optical Ethernet PHY Transceivers Delivers Precision Time Protocol and MACsec Encryption for Long-Reach Networking
Anritsu Launches Virtual Network Measurement Solution to Evaluate Communication Quality in Cloud and Virtual Environments
Anritsu Corporation announced the launch of its Virtual Network Master for AWS MX109030PC, a virtual network measurement solution that runs in Amazon Web Services (AWS) cloud environments. The software-based solution enables accurate, repeatable evaluation of communication quality across networks, including cloud and virtual environments, measuring key quality indicators such as latency, jitter, throughput, and packet (frame) loss rate in both one-way and round-trip directions. It can accurately evaluate end-to-end (E2E) communication quality even in virtual environments where hardware test instruments cannot be installed.
Moreover, adding the Network Master Pro MT1000A/MT1040A test hardware on the network side supports consistent quality evaluation from the core and cloud out to field-deployed devices.
The solution works by deploying software probes running on AWS across cloud, data-center, and virtual networks, enabling precise, reproducible assessment of E2E communication quality under realistic operating conditions, even where hardware test instruments cannot be located.
Rohde & Schwarz enables MediaTek’s 6G waveform verification with CMP180 radio communication tester
Rohde & Schwarz announced that MediaTek is utilizing the CMP180 radio communication tester to test and verify TC-DFT-s-OFDM, a proposed waveform technology for 6G networks. This collaboration demonstrates the critical role of advanced test equipment in developing foundational technologies for next-generation wireless communications.
TC-DFT-s-OFDM (Trellis Coded Discrete Fourier Transform spread Orthogonal Frequency Division Multiplexing) is being proposed to 3GPP as a potential candidate technology for 6G standardization. MediaTek’s research shows that TC-DFT-s-OFDM delivers superior Maximum Coupling Loss (MCL) performance across various modulation orders, including advanced configurations like 16QAM.
Key benefits of this 6G waveform proposed by MediaTek include enhanced cell coverage through reduced power back-off requirements and improved power efficiency through optimized power amplifier operation techniques such as Average Power Tracking (APT). TC-DFT-s-OFDM enables up to 4dB higher transmission power compared to traditional modulation schemes while maintaining lower interference levels, implying up to 50% gain in coverage area.
“MediaTek’s selection of our CMP180 for their 6G waveform verification work demonstrates the instrument’s capability to support cutting-edge research and development,” said Fernando Schmitt, Product Manager, Rohde & Schwarz. “As the industry advances toward 6G, we’re committed to providing test solutions that enable our customers to push the boundaries of wireless technology.”
The collaboration will be showcased at this year’s Brooklyn 6G Summit, November 5-7, highlighting industry progress toward defining technical specifications for future wireless communications. As TC-DFT-s-OFDM advances through the 3GPP standardization process, rigorous testing using advanced equipment becomes increasingly critical.
The CMP180 radio communication tester is part of the comprehensive test and measurement portfolio from Rohde & Schwarz designed to support wireless technology development from research through commercial deployment.
STMicroelectronics powers 48V mild-hybrid efficiency with flexible automotive 8-channel gate driver
The L98GD8 driver from STMicroelectronics has eight fully configurable channels for driving MOSFETs in flexible high-side and low-side configurations. It can operate from a 58V supply and provides rich diagnostics and protection for safety and reliability.
The 48V power net lets car makers increase the capabilities of mild-hybrid systems, including integrated starter-generators, extending electric-drive modes, and enhancing energy recovery to meet stringent new, globally harmonized vehicle-emission tests. Powering additional large loads at 48V, such as the e-compressor, pumps, fans, and valves, further raises the overall electrical efficiency and lowers the vehicle weight.
ST’s L98GD8 assists the transition, as an integrated solution optimized for driving the gates of NMOS or PMOS FETs in 48V-powered systems. With eight independent, configurable outputs, a single driver IC controls MOSFETs connected as individual power switches or as high-side and low-side switches in up to two H-bridges for DC-motor driving. It can also provide peak-and-hold control for electrically operated valves. The gate current is programmable, helping engineers minimize MOSFET switching noise to meet electromagnetic compatibility (EMC) regulations.
Automotive-qualified and equipped to meet the industry’s high safety and reliability demands, the L98GD8 has per-channel diagnostics for short-circuit-to-battery, open-load, and short-to-ground faults. Further diagnostic features include logic built-in self-test (BIST), over-/under-voltage monitoring with hardware self-check (HWSC), and a configurable communication check (CC) watchdog timer.
In addition, overcurrent sensing allows many flexible configurations, while the ability to monitor the drain-source voltage of external MOSFETs and the voltage across an external shunt resistor further enhances system reliability. There is also an ultrafast overcurrent shutdown with dual-redundant failsafe pins, battery undervoltage monitoring, an ADC for battery and die-temperature monitoring, and H-bridge current limiting.
The L98GD8 is in production now, in a 10mm x 10mm TQFP64 package, with budgetary pricing starting at $3.94 for orders of 1,000 pieces.
Keysight Advances Quantum Engineering with New System-Level Simulation Solution
Keysight Technologies announced the release of Quantum System Analysis, a breakthrough Electronic Design Automation (EDA) solution that enables quantum engineers to simulate and optimize quantum systems at the system level. This new capability marks a significant expansion of Keysight’s Quantum EDA portfolio, which includes Quantum Layout, QuantumPro EM, and Quantum Circuit Simulation. This announcement comes at a pivotal moment for the field, especially following the 2025 Nobel Prize in Physics, which recognized advances in superconducting quantum circuits, a core area of focus for Keysight’s new solution.
Quantum System Analysis empowers researchers to simulate the quantum workflow, from initial design stages to system-level experiments, reducing reliance on costly cryogenic testing and accelerating time-to-validation. This integrated approach supports simulations of quantum experiments and includes tools to optimize dilution fridge input lines for thermal noise and qubit temperature estimation.
Quantum System Analysis introduces two transformative features:
- Time Dynamics Simulator: Models the time evolution of quantum systems using Hamiltonians derived from electromagnetic or circuit simulations. This enables accurate simulation of quantum experiments such as Rabi and Ramsey pulsing, helping researchers understand qubit behavior over time.
- Dilution Fridge Input Line Designer: Allows precise modeling of cryostat input lines to qubits, enabling thermal noise analysis and effective qubit temperature estimation. By simulating the fridge’s input architecture, engineers can minimize thermal photon leakage and improve system fidelity.
Chris Mueth, Senior Director for New Markets at Keysight, said: “Quantum System Analysis marks the completion of a truly unified quantum design workflow, seamlessly connecting electromagnetic and circuit-level modeling with comprehensive system-level simulation. By bridging these domains, it eliminates the need for fragmented toolchains and repeated cryogenic testing, enabling faster innovation and greater confidence in quantum system development.”
Mohamed Hassan, Quantum Solutions Planning Lead at Keysight, said: “Quantum System Analysis is a leap forward in accelerating quantum innovation. By shifting left with simulation, we reduce the need for repeated cryogenic experiments and empower researchers to validate system-level designs earlier in the development cycle.”
Quantum System Analysis is available as part of Keysight’s Advanced Design System (ADS) 2026 platform and complements existing quantum EDA solutions. It supports superconducting qubit platforms and is extensible to other modalities such as spin qubits, making it a versatile choice for quantum R&D teams.
Makefile vs. YAML: Modernizing verification simulation flows

Automation has become the backbone of modern SystemVerilog/UVM verification environments. As designs scale from block-level modules to full system-on-chips (SoCs), engineers rely heavily on scripts to orchestrate compilation, simulation, and regression. The effectiveness of these automation flows directly impacts verification quality, turnaround time, and team productivity.
For many years, the Makefile has been the tool of choice for managing these tasks. With its rule-based structure and wide availability, Makefile offered a straightforward way to compile RTL, run simulations, and execute regressions. This approach served well when testbenches were relatively small and configurations were simple.
However, as verification complexity exploded, the limitations of Makefile have become increasingly apparent. Mixing execution rules with hardcoded test configurations leads to fragile scripts that are difficult to scale or reuse across projects. Debugging syntax-heavy Makefiles often takes more effort than writing new tests, diverting attention from coverage and functional goals.
These challenges point toward the need for a more modular and human-readable alternative. YAML, a structured configuration language, addresses many of these shortcomings when paired with Python for execution. Before diving into this solution, it’s important to first examine how today’s flows operate and where they struggle.
Current scenario and challenges
In most verification environments today, Makefile remains the default choice for controlling compilation, simulation, and regression. A single Makefile often governs the entire flow—compiling RTL and testbench sources, invoking the simulator with tool-specific options, and managing regressions across multiple testcases. While this approach has been serviceable for smaller projects, it shows clear limitations as complexity increases.
Below is an outline of key challenges.
- Configuration management: Test lists are commonly hardcoded in text or CSV files, with seeds, defines, and tool flags scattered across multiple scripts. Updating or reusing these settings across projects is cumbersome.
- Readability and debugging: Makefile syntax is compact but cryptic, which makes debugging errors non-trivial. Even small changes can cascade into build failures, demanding significant engineer time.
- Scalability: As testbenches grow, adding new testcases or regression suites quickly bloats the Makefile. Managing hundreds of tests or regression campaigns becomes unwieldy.
- Tool dependence: Each Makefile is typically tied to a specific simulator, such as VCS, Questa, or Xcelium. Porting the flow to a different tool requires major rewrites.
- Limited reusability: Teams often reinvent similar flows for different projects, with little opportunity to share or reuse scripts.
These challenges shift the engineer’s focus away from verification quality and coverage goals toward the mechanics of scripting and tool debugging. Therefore, the industry needs a cleaner, modular, and more portable way to manage verification flows.
Makefile-based flow
A traditional Makefile-based verification flow centers around a single file containing multiple targets that handle compilation, simulation, and regression tasks. See the representative structure below.
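A minimal sketch of such a Makefile, assuming a VCS-based flow (targets, flags, and file names here are illustrative, not the article’s exact listing):

```make
# Illustrative VCS flow; recipe lines must be indented with tabs.
TEST ?= base_test
SEED ?= 1

COMP_OPTS = -full64 -sverilog -ntb_opts uvm-1.2 -timescale=1ns/1ps \
            +incdir+tb -f rtl.f -f tb.f

compile:
	vcs $(COMP_OPTS) -o simv -l compile.log

run: compile
	./simv +UVM_TESTNAME=$(TEST) +ntb_random_seed=$(SEED) -l $(TEST).log

regress: compile
	for t in $$(cat testlist.txt); do \
	  ./simv +UVM_TESTNAME=$$t -l $$t.log; \
	done

clean:
	rm -rf simv* csrc *.log
```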

This approach offers clear strengths: it is immediately familiar to software engineers, requires no additional tools, and provides straightforward dependency management. For small teams with stable toolchains, this simplicity remains compelling.
However, significant challenges emerge with scale. The cryptic syntax becomes problematic: escaped backslashes, shell expansions, and dependency chains create arcane scripting rather than readable configuration. Debug cycles lengthen with cryptic error messages, and modifications require deep Make expertise.
Tool coupling is evident in the above structure—compilation flags, executable names, and runtime arguments are VCS-specific. Supporting Questa requires duplicating rules with different syntax, creating synchronization challenges.
Maintenance overhead also grows steeply. Adding tests requires multiple modifications, parameter changes demand careful shell escaping, and regression management quickly outgrows Make’s capabilities, forcing hybrid scripting solutions.
These drawbacks motivate the search for a more human-readable, reusable configuration approach, which is where YAML’s structured, declarative format offers compelling advantages for modern verification flows.
YAML-based flow
YAML (YAML Ain’t Markup Language) provides a human-readable data serialization format that transforms verification flow management through structured configuration files. Unlike Makefile’s imperative commands, YAML uses declarative key-value pairs with intuitive indentation-based hierarchy.
See below this YAML configuration structure that replaces complex Makefile logic:
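A sketch of what such a configuration might look like (the keys and file names are illustrative assumptions, not a fixed schema):

```yaml
tool: vcs            # vcs | questa | xsim

compile:
  files: [rtl.f, tb.f]
  incdirs: [tb]
  timescale: 1ns/1ps

tests:
  - name: base_test
    seed: 1
  - name: random_smoke
    seed: random
    plusargs: [+burst_len=8]

regression:
  smoke: [base_test, random_smoke]
```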


The modular structure becomes immediately apparent through organized directory hierarchies. As shown in Figure 1, a well-structured YAML-based verification environment separates configurations by function and scope, enabling different team members to modify their respective domains without conflicts.

Figure 1 The block diagram highlights the YAML-based verification directory structure. Source: ASICraft Technologies
Block-level engineers manage component-specific test configurations (IP1 and IP2), while integration teams focus on pipeline and regression management. Instead of monolithic Makefiles, teams can organize configurations across focused files: build.yml for compilation settings, sim.yml for simulation parameters, and various test-specific YAML files grouped by functionality.
Advanced YAML features like anchors and aliases eliminate configuration duplication using the DRY (Don’t Repeat Yourself) principle.
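For instance, shared simulation settings can be written once as an anchor and merged into each test (a hypothetical snippet):

```yaml
sim_defaults: &sim_defaults
  seed: random
  coverage: true
  verbosity: UVM_MEDIUM

tests:
  - name: smoke_test
    <<: *sim_defaults          # reuse the anchored defaults
  - name: stress_test
    <<: *sim_defaults
    seed: 42                   # override just one field
```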

Tool independence emerges naturally since YAML contains only configuration data, not tool-specific commands. The same YAML files can drive VCS, Questa, or XSIM simulations through appropriate Python parsing scripts, eliminating the need for multiple Makefiles per tool.
Of course, YAML alone doesn’t execute simulations; it needs a bridge to EDA tools. This is achieved by pairing YAML with lightweight Python scripts that parse configurations and generate appropriate tool commands.
Implementation of YAML-based flow
The transition from YAML configuration to actual EDA tool execution follows a systematic four-stage process, as illustrated in Figure 2. This implementation addresses the traditional verification challenge where engineers spend excessive time writing complex Makefiles and managing tool commands instead of focusing on verification quality.

Figure 2 The four-stage bridge from YAML configuration to EDA tool execution. Source: ASICraft Technologies
YAML files serve as comprehensive configuration containers supporting diverse verification needs, as the sketch following this list illustrates.
- Project metadata: Project name, descriptions, and version control
- Tool configuration: EDA tool selection, licenses, and version specifications
- Compilation settings: Source files, include directories, definitions, timescale, and tool-specific flags
- Simulation parameters: Tool flags, snapshot paths, and log directory structures
- Test specifications: Test names, seeds, plusargs, and coverage options
- Regression management: Test lists, reporting formats, and parallel execution settings
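A hypothetical top-level file touching each of these categories might look like the following (names and values are assumptions for illustration):

```yaml
project:   {name: soc_top_verif, version: "1.3"}
tool:      {name: vcs, version: "2023.12"}

compile:
  files: [rtl.f, tb.f]
  defines: [ASSERT_ON]
  timescale: 1ns/1ps

simulate:
  snapshot: out/simv
  logdir: logs

tests:
  - {name: base_test, seed: 1, coverage: true}

regression:
  nightly: {tests: [base_test], parallel: 4, report: junit}
```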

Figure 3 The phases of the Python YAML-parsing workflow. Source: ASICraft Technologies
The Python implementation demonstrates the complete flow pipeline. Starting with a simple YAML configuration:
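For instance, a minimal sim.yml (the file name and keys are hypothetical):

```yaml
# sim.yml -- minimal illustrative configuration
tool: vcs
top: tb_top
files: [rtl.f, tb.f]
test: base_test
seed: 7
```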

The Python script below loads and processes this configuration:
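A minimal sketch of such a parser, assuming PyYAML is installed and the sim.yml above is present; the VCS and Xcelium command templates are illustrative, not exhaustive:

```python
#!/usr/bin/env python3
"""Sketch: translate a YAML verification config into simulator commands."""
import subprocess
import yaml  # pip install pyyaml

def build_commands(cfg):
    """Build (compile_cmd, run_cmd) strings for the configured tool."""
    files = " ".join(f"-f {f}" for f in cfg["files"])
    if cfg["tool"] == "vcs":
        comp = f"vcs -full64 -sverilog {files} -top {cfg['top']} -o simv"
        run = (f"./simv +UVM_TESTNAME={cfg['test']} "
               f"+ntb_random_seed={cfg['seed']}")
    elif cfg["tool"] == "xcelium":
        comp = f"xrun -elaborate -64bit {files} -top {cfg['top']}"
        run = f"xrun -R -svseed {cfg['seed']} +UVM_TESTNAME={cfg['test']}"
    else:
        raise ValueError(f"unsupported tool: {cfg['tool']}")
    return comp, run

if __name__ == "__main__":
    with open("sim.yml") as fh:
        cfg = yaml.safe_load(fh)             # phase 1: load/parse
    comp_cmd, run_cmd = build_commands(cfg)  # phases 2-3: extract + build
    print("COMPILE:", comp_cmd)              # phase 4: display ...
    print("RUN:    ", run_cmd)
    # ... or execute directly instead of printing:
    # subprocess.run(comp_cmd, shell=True, check=True)
    # subprocess.run(run_cmd, shell=True, check=True)
```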

When executed, the Python script produces clear output, showing the command translation, as illustrated below:
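With the sample sim.yml above, the sketch would print something like:

```text
COMPILE: vcs -full64 -sverilog -f rtl.f -f tb.f -top tb_top -o simv
RUN:     ./simv +UVM_TESTNAME=base_test +ntb_random_seed=7
```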

The complete processing workflow operates in four systematic phases, as detailed in Figure 3.
- Load/parse: The PyYAML library converts YAML file content into native Python dictionaries and lists, making configuration data accessible through standard Python operations.
- Extract: The script accesses configuration values using dictionary keys, retrieving tool names, file lists, compilation flags, and simulation parameters from the structured data.
- Build commands: The parser intelligently constructs tool-specific shell commands by combining extracted values with appropriate syntax for the target simulator (VCS or Xcelium).
- Display/execute: Generated commands are shown for verification or directly executed through subprocess calls, launching the actual EDA tool operations.
This implementation creates true tool-agnostic operation. The same YAML configuration generates VCS, Questa, or XSIM commands by simply updating the tool specification. The Python translation layer handles all syntax differences, making flows portable across EDA environments without configuration changes.
The complete pipeline—from human-readable YAML to executable simulation commands—demonstrates how modern verification flows can prioritize engineering productivity over infrastructure complexity, enabling teams to focus on test quality rather than tool mechanics.
Comparison: Makefile vs. YAML
Both approaches have clear strengths and weaknesses that teams should evaluate based on their specific needs and constraints. Table 1 provides a systematic comparison across key evaluation criteria.

Table 1 Flow comparison between Makefile and YAML. Source: ASICraft Technologies
Where Makefiles work better
- Simple projects with stable, unchanging requirements
- Small teams already familiar with Make syntax
- Legacy environments where changing infrastructure is risky
- Direct execution needs required for quick debugging without intermediate layers
- Incremental builds where dependency tracking is crucial
Where YAML excels
- Growing complexity with multiple test configurations
- Multi-tool environments supporting different simulators
- Team collaboration where readability matters
- Frequent modifications to test parameters and configurations
- Long-term maintenance across multiple projects
The reality is that most teams start with Makefiles for simplicity but eventually hit scalability walls. YAML approaches require more extensive initial setup but pay dividends as projects grow. The decision often comes down to whether you’re optimizing for immediate simplicity or long-term scalability.
For established teams managing complex verification environments, YAML-based flows typically provide better return on investment (ROI). However, teams should consider practical factors like migration effort and existing tool integration before making the transition.
Choosing between Makefile and YAML
The challenges with traditional Makefile flows are clear: cryptic syntax that’s hard to read and modify, tool-specific configurations that don’t port between projects, and maintenance overhead that grows with complexity. As verification environments become more sophisticated, these limitations consume valuable engineering time that should focus on actual test development and coverage goals.
The YAML-based flows address these fundamental issues through human-readable configurations, tool-independent designs, and modular structures that scale naturally. Teams can simply describe verification intent—run 100 iterations with coverage—while the flow engine handles all tool complexity automatically. The same approach works from block-level testing to full-chip regression suites.
Key benefits realized with YAML
- Faster onboarding: New team members understand YAML configurations immediately.
- Reduced maintenance: Configuration changes require simple text edits, not scripting.
- Better collaboration: Clear syntax eliminates the “Makefile expert” bottleneck.
- Tool flexibility: Switch between VCS, Questa, or XSIM without rewriting flows.
- Project portability: YAML configurations move cleanly between different projects.
The choice between Makefile and YAML approaches ultimately depends on project complexity and team goals. Simple, stable projects may continue benefiting from Makefile simplicity. However, teams managing growing test suites, multiple tools, or frequent configuration changes will find YAML-based flows providing better long-term returns on their infrastructure investment.
Meet Sangani is an ASIC verification engineer at ASICraft Technologies.
Hitesh Manani is a senior ASIC verification engineer at ASICraft Technologies.
Shailesh Kavar is an ASIC verification technical manager at ASICraft Technologies.
Related Content
- Addressing the Verification Bottleneck
- Making Verification Methodology and Tool Decisions
- Gate level simulations: verification flow and challenges
- Specifications: The hidden bargain for formal verification
- Shift-Left Verification: Why Early Reliability Checks Matter
Wolfspeed cuts quarterly loss after CapEx slashed during restructuring
NUBURU completes first phase of Orbit acquisition
Infineon’s new MOTIX system-on-chip family for motor control enables compact and cost-efficient designs
Infineon Technologies is expanding its MOTIX 32-bit motor control SoC (system-on-chip) family with new solutions for both brushed (BDC) and brushless (BLDC) motor applications: the TLE994x and TLE995x. The new products are tailored for small- to medium-sized automotive motors, covering functions ranging from battery cooling in electric vehicles to comfort features such as seat adjustment. The number of such motors continues to grow in modern vehicles, especially electric ones, and they are used in an increasing number of safety-critical applications.
Therefore, car manufacturers require reliable, compact, and cost-effective solutions that integrate multiple functions. Based on Infineon’s extensive experience in motor control, the new SoCs combine advanced integration with functional safety and cybersecurity-relevant features.
The three-phase TLE995x (BLDC) is ideal for pumps and fans in thermal management systems, while the two-phase TLE994x (BDC) targets comfort functions such as electric seats and power windows. Both devices integrate advanced diagnostic and protection functions that support reliable motor operation.
By combining a gate driver, microcontroller, communication interface, and power supply in a single chip, Infineon’s SoCs offer exceptional functionality with minimal footprint. The new LIN-based devices feature an Arm Cortex-M23 core running at up to 40 MHz, with integrated flash and RAM. Field-Oriented Control (FOC) capability ensures efficient and precise motor operation. Compared to the established TLE986x/7x family, the TLE994x/5x offers enhanced peripherals, flexible PWM generation via the CCU7, and automatic LIN message handling to reduce CPU load. All devices comply with ISO 26262 (ASIL B) for functional safety. Additionally, the integrated Arm TrustZone technology provides a foundation for improved system security.
India’s Battery Manufacturing Capacity Projected to Reach 100 GWh by Next Year: Experts
India’s battery manufacturing capacity stands at nearly 60 GWh and is projected to reach 100 GWh by next year, said Mr. Nikhil Arora, Director, Encore Systems.
“With automation efficiencies crossing 95% and advanced six-axis robotics handling 625Ah, 12kg cells, we are driving large-scale localization in the energy storage value chain,” Arora said. “Our sodium-based cell technologies (safer, highly recyclable, and ideal for grid-scale storage) reflect India’s growing self-reliance in clean energy.”
“Collaborations with IIT Roorkee, NIT Hamirpur, and local automation partners are accelerating innovation and technology transfer. As storage costs fall from ₹1.77 to ₹1.2 per unit in five years, India is set to achieve cost parity between solar and storage, advancing its journey toward energy independence,” Arora said while speaking at the 18th Renewable Energy India Expo in Greater Noida.
Speaking at the event, Mr. Ankit Dalmia, Partner, Boston Consulting Group, said, “India’s next five years will be shaped by advances in battery storage, digitalization, and green hydrogen. Emerging chemistries such as LFP, sodium-ion, and solid-state batteries could cut storage costs by up to 40% by 2030, enabling 24×7 renewable power. AI-driven grid management and smart manufacturing are improving reliability and reducing system costs by nearly 20%. The National Green Hydrogen Mission, targeting 5 million tonnes of production annually by 2030, is positioning India to capture about 10% of global green-hydrogen capacity.”
“With the right policy support, manufacturing scale-up, and global partnerships, India can become a resilient, low-cost hub for clean energy and battery innovation. India’s clean-energy ecosystem represents a US$200–250 billion investment opportunity this decade, with targets of 500 GW of renewables and 200 GWh of storage by 2030. Investors are focusing on hybrid RE + storage, grid-scale batteries, and pumped storage projects, while companies leverage AI and digital twins for smarter grid integration. Despite policy and land challenges, strong momentum and falling costs are powering rapid growth,” he added.
Mr. Arush Gupta, CEO, OKAYA Power Private Limited, said, “OKAYA has powered over 3 million Indian households with its inverter and power backup solutions and is now accelerating its presence in solar and lithium storage. With a new ₹140 crore facility coming up in Neemrana, we’re scaling both lithium and inverter production to meet growing residential demand. Solar is projected to contribute nearly 40% of our business within the next five years, driven by initiatives like the PM Suryaghar Muft Bijli Yojana. As India targets 1 crore solar-powered homes, our focus is on providing efficient, digitally enabled rooftop solutions built on indigenous technology. By integrating advanced BMS and power electronics, we aim to make every Indian household energy independent and future-ready.”
Acharya Balkrishna, Head, Patanjali, said, “At Patanjali, our vision has always been to contribute to the nation’s development and people’s prosperity through Swadeshi solutions, be it in health, wellness, or daily essentials. Extending the same philosophy to renewable energy, we are committed to advancing solar and battery technologies that reduce foreign dependence and make clean energy affordable for all. Solar energy, a divine and continuous source, holds the key to meeting India’s growing power needs at minimal cost. Through Swadeshi-driven innovation and collaboration, we aim to ensure that sustainable and economical solar solutions reach every household in the country.”
Mr. Inderjit Singh, Founder & Managing Director, INDYGREEN Technologies, said, “We provide customized battery solutions across L5, C&I, and utility-scale BESS segments, designed to balance performance, scale, and economics for Indian customers. With over 100 battery assembly lines successfully implemented, we aim to expand multifold in the next two years, targeting over 20 GWh of battery lines and 20 GW of solar PV manufacturing solutions. Leveraging IoT and AI-driven technologies, we enhance battery safety, thermal management, and lifecycle efficiency while supporting OEMs and Tier 1 suppliers with advanced insulation and fire-safety systems. Additionally, we’re enabling India’s industrial lithium cell ecosystem through pilot-line infrastructure for premier institutes and labs, alongside showcasing high-efficiency solar cell lines and large-scale BESS assembly solutions at REI and The Battery Show India.”
Sharing perspective on the co-located expos, Mr. Yogesh Mudras, Managing Director, Informa Markets in India, said, “India’s clean energy transition is accelerating faster than ever, with renewable capacity surpassing 250 GW in 2025 and a strong pipeline targeting 500 GW by 2030. The Ministry of Power has approved a ₹5,400 crore Viability Gap Funding (VGF) scheme for 30 GWh of Battery Energy Storage Systems (BESS), in addition to 13.2 GWh already underway, which is expected to attract ₹33,000 crore in investments by 2028.”
Evil Sine Wave tutorial (a lot of people have been asking for this)
EEVblog 1717 - Rigol's INSANE New $999 350MHz Oscilloscope: TEARDOWN
Latest issue of Semiconductor Today now available
Thought you might like this small neon bulb driver
Thought you might like this little circuit that drives an unusual neon bulb. The difference from the usual bulbs you salvage is that this bulb must not have a resistor attached; I removed mine from the neon-bulb fuse-like package. For anyone wondering, I found this in an old probing screwdriver that broke. Transistor + phone-charger transformer + a resistor. Take time to measure the coils. My multimeter isn't precise at all, but I measured the coils to be 0.6, 1.2, and 6.7 ohms. Once I measure it better, I will post the results, but all three that I built have approximately the same ratios between them. I am providing a bare schematic; the rest of the components on the board are a tactile switch, a Li-Po charger, and a battery connector. The interesting thing is that the voltage across the bulb is polarized and only one side of the bulb lights up (the negative side, I believe). I love the circuit and the vibe, and I hope I'm not the only one.
Astable Multivibrator Using BJT
555 oscillator
This is my 555 timer circuit in action. The green waveform shows the capacitor charging and discharging, while the yellow trace flips high and low each time the voltage crosses its thresholds. It's a simple demo, but it illustrates how analog voltage turns into digital logic. (Still learning.)
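For reference, the standard 555 astable relationships (assuming the classic two-resistor configuration, which the post doesn't specify): the capacitor charges through R1 + R2 toward Vcc and discharges through R2, with comparator thresholds at 1/3 and 2/3 Vcc, giving

t_high = ln(2) * (R1 + R2) * C ≈ 0.693 (R1 + R2) C
t_low = ln(2) * R2 * C ≈ 0.693 R2 C
f = 1 / (t_high + t_low) ≈ 1.44 / ((R1 + 2 R2) C)

So, for example, R1 = 1 kΩ, R2 = 10 kΩ, and C = 100 nF give roughly 690 Hz.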
I made a camera from an optical mouse. 30x30 pixels in 64 glorious shades of gray!
I was digging through some old stuff and found a PCB from a mouse I'd saved long ago, specifically because I knew it was possible to read images from them. The new-project itch struck, and after 65 hours, I made this!
It was a fun design challenge to make this thing as small as I could; the guts are completely packed. There's a ribbon cable connecting the electronics in the two halves. I tried to cram in a connector (0.05" pitch header), but it was too bulky to fit.
The panorama "smear shot" is definitely my favorite mode; it scans out one column at a time across the screen as you sweep the camera. It's scaled 2x vertically but 1x horizontally, so you get extra "temporal resolution" horizontally if you do the sweep well.
The construction style is also something I enjoy for one-off projects: no PCB, just cobble together stuff I've got plus whatever extra parts I need and design the case to fit. If I ever made more, I'd make a board for sure (and it would shrink the overall size), but it's fun to hand-make stuff like this.
Despite the low resolution, it's easily possible to take recognizable pictures of stuff. The "high" color depth certainly helps. I'd liken it to the Game Boy Camera (which I also enjoy), which is much higher resolution but only has 4 colors!
I tried to post a video for you all, but they're not allowed here. I'll link it in the comments once I cross-post to another subreddit.
First time making a real plasma toroidal discharge in a glass sphere
I made a simple push-pull oscillator circuit that has no problem lighting up stable toroidal discharges. It works much better than the single-transistor Class-E oscillator circuits you find everywhere, which always have a hard time igniting the discharge. My project draws about 40W, and at most about 100W; I think that's a lot, but the effects it creates are fun to watch.
Computer-on-module architectures drive sustainability

Sustainability has moved from corporate marketing to a board‑level mandate. For technology companies, this shift is more than meeting environmental, social, and governance frameworks; it reflects the need to align innovation with environmental and social responsibility among all key stakeholders.
Regulators are tightening reporting requirements while investors respond favorably to sustainable strategies. Customers also want tangible progress toward these goals. The debate is no longer about whether sustainability belongs in technology roadmaps but how it should be implemented.
The hidden burden of embedded and edge systems
Electronic systems power a multitude of devices in our daily lives. From industrial control systems and vital medical technology to household appliances, these systems usually run around the clock for years on end. Consequently, operating them requires a lot of energy.
Usually, electronic systems are part of a larger ecosystem and are difficult to replace in the event of failure. When this happens, complete systems are often discarded, resulting in a surplus of electronic waste.
Rapid advances in technology make this issue more pronounced. Processor architectures, network interfaces, and security protocols become obsolete in shorter cycles than they did just a few years ago. As a result, organizations often retire complete systems after a brief service life, even though the hardware still meets its original requirements. The continual need to update to newer standards drives up costs and can undermine sustainability goals.
Embedded and edge systems are foundational technologies driving critical infrastructure in industrial automation, healthcare, and energy applications. As such, the same issues with short product lifecycles and limited upgradeability put them in the same unfortunate bucket of electronic waste and resource consumption.
Bridging the gap between performance demands and sustainability targets requires rethinking system architectures. This is where off-the-shelf computer-on-module (COM) designs come in, offering a path to extended lifecycles and reduced waste while simultaneously future-proofing technology investments.
How COMs extend product lifecycles
Open embedded computing standards such as COM Express, COM-HPC, and Smart Mobility Architecture (SMARC) separate computing components—including processors, memory, network interfaces, and graphics—from the rest of the system. By separating the parts from the whole, they allow updates by swapping modules instead of requiring a complete system redesign.
This approach reduces electronic waste, conserves resources, and lowers long‑term costs, especially in industries where certifications and mechanical integration make complete redesigns prohibitively expensive. These sustainability benefits go beyond waste reduction: A modular system is easier to maintain, repair, and upgrade, meaning fewer devices end up prematurely as electronic waste.
Open standards that enable longevity
To simplify the development and manufacturing of COMs and to ensure interchangeability across manufacturers, consortia such as the PCI Industrial Computer Manufacturing Group (PICMG) promote and ratify open standards.
One of the most central standards in the embedded sector is COM Express. This standard defines various COM sizes, such as Type 6 or Type 10, to address different application areas; it also offers a seamless transition from legacy interfaces to modern differential interfaces, including DisplayPort, PCI Express, USB 3.0, or SATA. COM Express, therefore, serves a wide range of use cases from low-power handheld medical equipment to server-grade industrial automation infrastructure.
Expanding on these efforts, COM-HPC is the latest PICMG standard. Addressing high-performance embedded edge and server applications, COM-HPC arose from the need to meet increasing performance and bandwidth requirements that previous standards couldn’t achieve. COM-HPC COMs are available with three pinout types and six sizes for simplified application development. Target use cases range from powerful small-form-factor devices to graphics-oriented multi-purpose designs and robust multi-core edge servers.
COM-HPC, including congatec’s credit-card-sized COM-HPC Mini, provides high performance and bandwidth for all AI-powered edge computing and embedded server applications. (Source: congatec)
Alongside COM Express and COM-HPC, the Standardization Group for Embedded Technologies developed the SMARC standard to meet the demands of power-saving, energy-efficient designs requiring a small footprint. Similar in size to a credit card, SMARC modules are ideal for mobile and portable embedded devices, as well as for any industrial application that requires a combination of small footprint, low power consumption, and established multimedia interfaces.
As credit-card-sized COMs, SMARC modules are designed for size-, weight-, power-, and cost-optimized AI applications at the rugged edge. (Source: congatec)
As a company with close involvement in developing COM Express, COM-HPC, and SMARC, congatec is invested in the long-term success of more sustainable architectures. Offering designs for common carrier boards that can be used for different standards and/or modules, congatec’s approach allows product designers to use a single carrier board across many applications, as they simply swap the module when upgrading performance, removing the need for complex redesigns.
Virtualization as a path to greener systems
On top of modular design, extending hardware lifecycles requires intelligent software management. Hypervisors, software tools that create and manage virtual machines, add an important software layer to the sustainability benefits of COM architectures.
Virtualization allows multiple workloads to coexist securely on a single module, meaning that separate boards aren’t required to run essential tasks such as safety, real-time control, and analytics. This consolidation simultaneously lowers energy consumption while decreasing the demand for the raw materials, manufacturing, and logistics associated with more complex hardware.
Hypervisors such as congatec aReady.VT are real-time virtualization software tools that consolidate functionality that previously required multiple dedicated systems onto a single hardware platform. (Source: congatec)
Enhancing sustainability through COM-based designs
The rapid adoption of technologies such as edge AI, real‑time analytics, and advanced connectivity has inspired industries to strive for scalable platforms that also meet sustainability goals. COM architectures are a great example, demonstrating that high performance and environmental responsibility are compatible. They show technology and business leaders that designing sustainability into product architectures and technology roadmaps, rather than treating it as an afterthought, makes good practical and financial sense.
With COM-based modules already providing a flexible and field-proven foundation, the embedded sector is off to a good start in shrinking environmental impact while preserving long-term innovation capability.



