EDN Network
Flip-flop plus choke comprise simple and cheap inductive sensor

Sensors that detect and track small metallic objects are handy gadgets, whether used to route workpieces on an assembly line or comb a beach for “treasures” lost in the sand. A typical sensor design consists of an inductor integrated into an oscillator, so that entry of metal into the magnetic field of the inductor changes the frequency of oscillation. Figure 1 shows a particularly simple and cheap example that produces a ~100 kHz variable frequency pulse train suitable for direct input to the internal counter/timer peripherals of typical MCUs.
Wow the engineering world with your unique design: Design Ideas Submission Guide
Figure 1 Cross-couple flip-flop outputs with a choke to make an oscillator with period proportional to inductance and thus a simple sensor of metallic objects.
Here’s how it works.
Q1, Q2, and associated Rs comprise an elementary set/reset bistable multivibrator (flip-flop) that, for as long as power is provided, would normally settle into, and hold, one of two stable and mutually exclusive states: Q1 ON and Q2 OFF, or vice-versa. But things get more interesting (and less stable) when inductor L is added between the transistor collectors as shown.
Now, due to the voltage difference between the collector of the OFF transistor (at ~2.5 V) and that of the ON transistor (at ~0 V) and, assuming choke series resistance << Rn (n = 1, 2, 3, or 4), current starts ramping up through L with a time-constant of L/(2R). This ramps down the voltage at the OFF transistor’s collector and thereby the current supplied to the base of the ON transistor. Eventually the base current drops too low for the current gain of the transistor to hold it in saturation, allowing it to turn OFF, whereupon the choke current (IL) drives the collector of this transistor to 5 V and switches the opposite transistor ON, causing IL to begin to reverse and a new half-cycle of oscillation to begin, each half-cycle having a period of:
Thalf = (L/(2R))Loge(hfe),
where R = Rn (n = 1, 2, 3, or 4), L = choke inductance, and
hfe = transistor current gain.
Thus,
Tcycle = 2Thalf = (L/R)Loge(hfe),
Fout = 1/Tcycle = R/(L × Loge(hfe)),
Typical hfe for the 2N3904 is ~150, therefore Loge(hfe) ~ 5, and
Fout = R/(5L).
Figure 2 and Figure 3 show in better detail where the timing relationships come from. Using the Figure 1 resistor and inductor values,
Fout = 1000 Ω / (5 × 2 mH) = 100 kHz.
Figure 2 Inductor current wave shape during one ~5µs oscillation half-cycle.
Figure 3 Oscillator waveforms: IL and Q1, Q2 output signals.
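As a numerical sanity check of the frequency expression, here is a short Python sketch using the Figure 1 component values; the hfe of 150 is the typical 2N3904 figure quoted above, not a measured one:

```python
import math

def osc_freq(L, R, hfe):
    """Oscillation frequency of the flip-flop/choke oscillator:
    Fout = 1/Tcycle, with Tcycle = (L/R) * ln(hfe)."""
    return R / (L * math.log(hfe))

# Figure 1 values: R = 1 kOhm, L = 2 mH, typical 2N3904 hfe ~ 150
f_out = osc_freq(L=2e-3, R=1000.0, hfe=150.0)
print(f"Fout = {f_out / 1e3:.1f} kHz")  # about 100 kHz
```

Since ln(150) ≈ 5.01 rather than exactly 5, the computed value lands a fraction of a percent below 100 kHz, well inside the tolerance of any real choke.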
Of course, actual 2N3904 hfe is both device and temperature dependent, and real chokes have significant series resistance, parasitic capacitance, and are seldom precision components to begin with. So, the frequency expression above is somewhat approximate, but is nevertheless accurate enough to not interfere with the intended application.
Power consumption from 5 V is moderate at ~50 mW, and the circuit is extremely tolerant of supply voltage, functioning from <1 V to >10 V, provided the supply-dependent output amplitude is acceptable.
Stephen Woodward’s relationship with EDN’s DI column goes back quite a long way. Nearly 100 submissions have been accepted since his first contribution back in 1974.
Related Content
- Rethink the button: An inductive-sensing approach
- Inductor-based astable 555 timer circuit
- The difference between inductive proximity, displacement, and eddy-current sensors
- Take-back-half thermostat uses ∆Vbe transistor sensor
- A short primer on festive failed filament finder
The post Flip-flop plus choke comprise simple and cheap inductive sensor appeared first on EDN.
A sneak peek at chiplet standards

The scaling of system-on-chip (SoC) architectures is hitting the wall, paving the way for die-to-die interconnects in heterogeneous single-package systems built from smaller dies commonly known as chiplets. But while these chiplet-optimized interconnect technologies are gaining significant traction, they are still in their infancy.
That makes chiplet interconnect standards crucial for the new multi-die semiconductor era. Below is a brief outline of three standards considered critical in the present evolution of chiplets. These standards will likely play a vital role in creating an open chiplet ecosystem.
- Bunch of Wires
The Bunch of Wires (BoW) interconnect technology defines an open and interoperable physical interface between a pair of dies inside a single package. It specifies a physical layer (PHY) optimized for SoC disaggregation to form the basis of multi-die interconnect for chiplets.
Eliyan’s founding CEO Ramin Farjadrad, who developed the original interconnect technology behind BoW, took it to the Open Compute Project (OCP) in 2018 for standardization. The technology was later adopted by the OCP as a chiplet interconnect scheme.
Figure 1 BoW is an open PHY specification for die-to-die (D2D) interconnect that offers parallel interfaces and can be implemented in organic laminate or advanced packaging technologies. Source: Eliyan
- Universal Chiplet Interconnect Express (UCIe)
The open industry standard for die-to-die connectivity was introduced in March 2022 by a consortium of over 80 companies, including semiconductor and packaging firms, foundries, and cloud services and IP suppliers. It’s an important step toward heterogeneous integration with multi-die systems and it’s aiming to create a new design ecosystem for semiconductor chiplets.
Figure 2 UCIe defines the key performance indicators for chiplets within a package. Source: UCIe
UCIe provides a plug-and-play interconnect at the package level and streamlines interoperability between dies built on different process technologies from various suppliers. It’s based on the same signaling and clocking schemes and architecture basics as the BoW interconnect, and the current release is the UCIe 1.1 specification.
- High Bandwidth Memory (HBM)
Though not a chiplet standard specifically, HBM is becoming a vital ingredient in chiplet designs for its ability to pack a larger number of memory chips into a smaller space. HBM enables multiple layers of memory chips to be stacked on top of each other by employing vertical channels called through-silicon vias (TSVs). It was originally designed to reduce the data travel distance between the memory and the processor.
Figure 3 HBM, originally designed for high-performance computing (HPC) applications, is acquiring a critical role in the chiplet design ecosystem. Source: Eliyan
HBM, initially conceived for compute-intensive applications in data centers and cloud computing, is now highly relevant in chiplet designs for its ability to vertically stack DRAM chips on top of one another. That’s why several new chiplet solutions now support UCIe as well as HBM protocol.
Related Content
- Chiplet interconnect handles 40 Gbps/bump
- IP partnerships stir the world of FPGA chiplets
- Chiplets Gain Popularity, Integration Challenges
- Chiplets advance one design breakthrough at a time
An LED display adapted for DIY projects

This expandable LED display for a microcontroller in Figure 1 has a simple interface with only 6 data/control wires and can easily be adapted to a DIY design. The display uses static (non-multiplexed) indication.
Figure 1 The expandable LED display for a microcontroller has a relatively simple interface with only 6 data/control wires.
The main parameters are:
- Digits: 8 Hex digits (may be more, expandable)
- Data & control wires: 6 (minimum)
- Supply voltage: 3…6 V
- Power consumption: < 3…5 mA/digit (Rlim in range 2 k – 2.4 k for E = 3.3 V, or Rlim in range 4.3 k – 5.1 k for E = 5 V)
If you choose the segment current too low (below ~0.7 mA), some segments may become much darker than others. For this reason, it is wise to check the performance of your indicators at low current first.
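A quick way to see where those Rlim ranges put the segment current: estimate it as the supply voltage minus the LED forward drop, divided by Rlim. The 1.8 V forward drop below is an assumed typical value for a red super-bright indicator, and the latch output drop is ignored:

```python
def segment_current_ma(E, Rlim, Vf=1.8):
    """Approximate per-segment current in mA.
    Vf = 1.8 V is an assumed red-LED forward drop; the small
    voltage drop across the latch output is ignored."""
    return (E - Vf) / Rlim * 1e3

print(segment_current_ma(3.3, 2.0e3))  # ~0.75 mA at E = 3.3 V, Rlim = 2 k
print(segment_current_ma(5.0, 4.3e3))  # ~0.74 mA at E = 5 V, Rlim = 4.3 k
```

Both operating points sit just above the ~0.7 mA floor where segment brightness starts to become uneven.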
The circuit in Figure 1 is optimized for 7 segment indicators such as the A-522SR (dual, with common anode and super-bright LEDs) and DIP versions of the 74HC259 (8-bit addressable latch) and 74HC137 (decoder).
By placing the pair latch/indicator in close vicinity to one another, we can strongly reduce not only the dimensions of the display, but the amount of soldering as well. This characteristic makes the circuit very suitable for DIY designs. In this design, we can solder the Rlim resistors (SMD 0805) directly between the corresponding legs of the latch and indicator. The pin assignment is shown for this case.
The algorithm for the static LED display is as follows. Initial/displaying state: P3=1, P4=0; all other pins are of no importance.
The steps required to change the data on the display are as follows:
- Set a digit’s address: P2..P0 = address
- P3 = negative strobe 1-0-1 to latch the address
- Set a segment’s address: P2..P0 = address
- Set a value of D: P5=0 if the segment must be ON, P5=1 if OFF
- P4 = positive strobe 0-1-0 to latch the value of D
- For all the other segments, repeat starting from step 3. When all the segments are done, return to step 1 while there are unattended digits.
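The update sequence above can be sketched in Python; `set_pin` and the dictionary of pin states are hypothetical stand-ins for whatever GPIO write routine your MCU toolchain provides:

```python
pins = {"P3": 1, "P4": 0}  # initial/displaying state

def set_pin(name, value):
    # Stand-in for a real MCU GPIO write
    pins[name] = value

def set_address(addr):
    # P2..P0 carry a 3-bit address (digit select or segment select)
    for bit in range(3):
        set_pin(f"P{bit}", (addr >> bit) & 1)

def write_digit(digit_addr, segments_on):
    """Refresh one digit; segments_on is the set of lit segment numbers 0..7."""
    set_address(digit_addr)               # step 1: digit address on P2..P0
    set_pin("P3", 0); set_pin("P3", 1)    # step 2: negative strobe 1-0-1
    for seg in range(8):
        set_address(seg)                  # step 3: segment address
        set_pin("P5", 0 if seg in segments_on else 1)  # step 4: D value
        set_pin("P4", 1); set_pin("P4", 0)             # step 5: strobe 0-1-0
```

Calling `write_digit` once per digit covers the final step; in firmware each `set_pin` would be a port write to the 74HC137/74HC259 pair.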
This principle can be expanded to more digits simply by using an analogous decoder with more legs.
—Peter Demchenko studied math at the University of Vilnius and has worked in software development.
Related Content
- LED current regulator has low dropout
- A safe adjustable regulator
- Extending the resolution of a peripheral DAC
- Shunt circuit clips large transients or regulates voltage
- How to design LED signage and LED matrix displays, Part 1
- My seven-segment LED-dimming mystery
mmWave chip enables multibeam multiplexing

Fujitsu has developed a mmWave chip that supports multibeam multiplexing (excluding polarization multiplexing) for use in the radio units (RUs) of 5G base stations. The technology enables up to four beams to be multiplexed by a single chip through the use of mmWave beamforming.
With conventional technologies, a single millimeter-wave chip is used to generate a single beam, resulting in larger RUs and increased power consumption. When the newly developed technology was applied in actual base stations, Fujitsu demonstrated high-speed, high-capacity communications at 10 Gbps or more from an RU half the size of a conventional unit. This technology will allow for systems with fewer millimeter-wave chips, which can ultimately reduce power consumption by as much as 30% per RU.
Fujitsu’s development effort was undertaken as part of the Research and Development Project of the Enhanced Infrastructures for Post-5G Information and Communication Systems commissioned by Japan’s New Energy and Industrial Technology Development Organization (NEDO). The company aims to begin worldwide commercial deployment of RUs equipped with the multibeam multiplexing technology in fiscal year 2024. Subsequently, the technology will be applied to base station centralized unit and distributed unit (CU/DU) products and offered globally in fiscal 2025.
A datasheet for the mmWave chip was not available at the time of this announcement.
Find more datasheets on products like this one at Datasheets.com, searchable by category, part #, description, manufacturer, and more.
Software automates mobile app testing

Eggplant 7.0 from Keysight enables quality assurance (QA) teams to test mobile apps on multiple devices and operating systems simultaneously. The enhanced automated test software can test more than 7500 operating system and device combinations through Sauce Labs’ Real Device Cloud.
Version 7.0 provides QA teams with improved collaboration features, seamless integration with version control and continuous integration (CI) tools, and the ability to connect with virtualized applications. Together, these capabilities enable automation and acceleration, while mitigating the risks inherent in manual testing.
Eliminating the need for costly device labs, Eggplant 7.0 provides instant test execution on any device and operating system via Sauce Labs’ cloud platform without having to manage physical devices. Running test cases simultaneously reduces overall test time and accelerates release cycles.
The test software now integrates with the Git version control system to allow teams to roll back to previous versions, compare changes, and collaborate across multiple branches. Eggplant also works with popular CI tools to incorporate test automation into continuous integration/continuous deployment (CI/CD) pipelines to identify and fix issues.
For more information or to obtain a price quote, use the product page link below.
Dual SiC MOSFET handles 2200 V

Toshiba has begun volume shipping of the MG250YD2YMS3, a dual SiC MOSFET module with a drain-source voltage rating of 2200 V for use in industrial equipment. Intended for 1500-VDC applications, such as photovoltaic power systems and energy storage systems, the module provides a continuous drain current of 250 A (500 A pulsed) and can operate at channel temperatures up to 150°C.
The MG250YD2YMS3 offers low conduction loss with a low drain-source on-voltage (sense) of 0.7 V typical. It also has turn-on and turn-off switching loss of 14 mJ and 11 mJ, respectively. These characteristics contribute to higher equipment efficiency. Low switching loss also allows a conventional three-level circuit to be replaced with a two-level circuit with a lower module count, which helps to downsize industrial equipment.
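To put those loss figures in perspective, average switching loss scales linearly with switching frequency. The 10 kHz used below is purely an illustrative assumption; the announcement does not state an operating frequency:

```python
def switching_loss_w(e_on_j, e_off_j, f_sw_hz):
    """Average switching power loss: (Eon + Eoff) * fsw."""
    return (e_on_j + e_off_j) * f_sw_hz

# MG250YD2YMS3 figures: Eon = 14 mJ, Eoff = 11 mJ.
# The 10 kHz switching frequency is an assumption, not a datasheet value.
print(switching_loss_w(14e-3, 11e-3, 10e3))  # 250 W per switch at 10 kHz
```

At lower frequencies the loss drops proportionally, which is why low per-cycle switching energy is what enables the two-level topology mentioned above.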
Key specifications for the MG250YD2YMS3 include:
Use the product page link below to access the datasheet for the MG250YD2YMS3 dual SiC MOSFET module.
Toshiba Electronic Devices & Storage
The post Dual SiC MOSFET handles 2200 V appeared first on EDN.
Automotive Hall ICs have high voltage tolerance

Two series of Hall effect ICs from Rohm have a withstand voltage of 42 V, enabling direct connection to a vehicle’s primary 12-V battery power supply. High withstand voltage contributes to improved reliability under battery power, which can fluctuate rapidly depending on the operating conditions.
Intended for automotive applications requiring magnetic detection, the BD5310xG-CZ series provides unipolar detection, while the BD5410xG-CZ series offers latch-type detection. Both series are AEC-Q100 Grade 1 qualified and have a wide operating supply voltage range of 2.7 V to 38 V.
A total of eleven models span detection magnetic flux densities ranging from 2.0 mT to 28.0 mT. Unipolar detection can be used for detecting position in applications such as vehicle door open/close and door locks. Latch detection is useful for detecting rotation in various motors used for power windows and sliding doors.
The automotive Hall ICs are housed in SSOP3A packages and operate over a temperature range of -40°C to +150°C. Devices are available through Rohm’s distributor network.
BD5310xG-CZ series product page
BD5410xG-CZ series product page
Hall ICs conserve power in battery-operated devices

Unipolar Hall effect switches from Diodes operate over a supply range of 1.6 V to 5.5 V with an average supply current of just 1.1 µA at 1.85 V. The low-power devices are designed for proximity sensing in battery-powered devices, including smartphones, laptops, and wearables. Their wide operating voltage range allows them to be powered directly from a device’s battery.
Output 1 in the dual-output AH139x series responds to a north pole, while output 2 responds to a south pole. Typical operating points for the AH1391 and AH1392 are 25 G and 30 G, respectively. The single-output AH138x series responds to a south pole on the part-marking side with typical operating points of 18 G (AH1381), 30 G (AH1382), and 45 G (AH1383).
According to the manufacturer, the tight operating window of these switches ensures a lower magnetic spread. Their chopper-stabilized design provides minimal switch-point drift and ensures temperature stability. Further, the parts have push-pull outputs that do not require an external pull-up resistor.
Suitable for small form-factor applications, all of the Hall switches are offered in compact DFN1010-4 and DFN1410-4 packages. AH138x devices are additionally available in SOT23 packages. The AH1391 and AH1392 in DFN packages cost $0.18 each in lots of 1000 units. The AH1381, AH1382, and AH1383 cost $0.17 for DFN versions and $0.16 for SOT23 versions in like quantities.
Arduino’s newest gambits: Connectivity, industrial designs

Arduino has made its next move to bolster its connectivity offerings by joining the AWS Partner Network (APN), a global community of over 100,000 cloud partners from more than 150 countries. The move aims to boost the Arduino PRO product line, introduced in 2020 at the request of OEMs and system integrators.
Arduino PRO features 24 industrial-grade products, including the Portenta X8 Linux SOM and UL-certified Opta PLC, and is deployed by more than 2,000 businesses worldwide.
Figure 1 Arduino Pro, a microcontroller-based board, can be programmed with the Arduino software download and powered via a USB header, battery, or external power supply. Source: Arduino
The announcement underscores two major shifts at the open-source developer platform. First, Arduino wants users to easily create connectivity applications via its cloud platform commonly known as Arduino Cloud. Second, it aims to move beyond a prototype or educational platform and transition toward commercial and industrial applications.
Arduino’s cloud journey
Arduino Cloud offers users an easy path to collect data, control the edge, and gain insights from connected products without the need to build, deploy, and maintain a custom Internet of Things (IoT) platform. The 3-year-old Arduino Cloud is built on AWS and processes 4 billion device messages every month.
Figure 2 Arduino Cloud is an online platform to configure, program, and connect devices with a dashboard that allows users to monitor and control Arduino boards from a web interface. Source: Arduino
Take the example of ABM Vapor Monitoring, which supervises commercial buildings across the United States to ensure that regulated air quality standards are met. The company claims to have slashed product development time by six months and saved over $250,000 in engineering services while using Arduino Cloud.
Arduino has spent the last few years developing the IoT cloud to ensure more users can develop connected products. However, its UNO boards didn’t offer connectivity. So, earlier this year, Arduino released the UNO R4 boards with more powerful processors alongside Wi-Fi and Bluetooth connectivity. “It became very simple to create connectivity products for mobile apps or the web that can be controlled remotely and integrated very easily with the cloud,” said Massimo Banzi, Arduino’s co-founder, chairman and CMO.
Figure 3 UNO R4, powered by a 32-bit microcontroller, comes with a Wi-Fi variant to allow users to connect to the Arduino Cloud and other platforms for IoT projects. Source: Arduino
“Arduino doesn’t just develop boards,” Banzi added. “It’s a combination of development environment and cloud community, which makes Arduino very easy.” He also pointed toward the open architecture at the core of every Arduino product that provides a preferred path to AWS for chips supported by Arduino.
Moreover, Arduino offers a high-level framework that allows users to migrate code to different platforms, with libraries abstracted over connectivity channels like Bluetooth to make it easier to develop connected applications. “We have developed a lot of libraries abstracted for high-level tasks, and everything is open source as much as possible,” Banzi said.
He gave the example of sensor-based connectivity applications in an Arduino environment. “You don’t need to read a lot of datasheet pages to figure out how to connect a sensor,” he said. “You search the sensor’s name, find one or more libraries developed by the Arduino community, and get going very quickly.”
Arduino in America
Arduino’s new AWS partnership also marks a shift beyond the common perception that it’s a prototyping or educational platform. As it advances toward commercial and industrial applications, Arduino has established a local presence in the United States, setting up two new offices and naming Guneet Bedi the head of U.S. operations.
“Arduino has been hugely popular in the United States, and we have a large community here,” Banzi said. “However, with Arduino being perceived as a prototyping or educational platform, we were able to manage this particular market with local partners without direct involvement.”
But with the launch of Arduino PRO, which is targeted at industrial products, Arduino must cater to large companies. “So, to serve these kinds of customers, we need to have a local team,” he added. “The local team can figure out what these companies need.”
Connectivity and industrial-grade applications mark a new chapter in Arduino’s design journey spanning nearly 15 years. And the latter part, which focuses on commercial and industrial applications, is intrinsically tied to its renewed presence in the United States. These are exciting times at the open-source hardware pioneer, with 32 million active developers worldwide.
Related Content
- Arduino Catches IoT Wave
- The 5 Best Arduino Projects
- Arduino board plugs DIYers into the cloud for $69
- Linux-Friendly Arduino Simplifies IoT Development
- Open-source HW in the Modern Era: Interview of Arduino’s CEO Fabio Violante
RLD-based astable 555 timer circuit

In the classic configuration and most variants of the astable 555 multivibrator circuit, the timing characteristics are based on the charging and discharging of a capacitor. However, it can be argued that since the exponential voltage of a capacitor is qualitatively similar to inductor current, the latter can be made an alternative timing element for the 555. This was shown in the “Inductor-based astable 555 timer circuit”. In Figure 1, we present another approach for an inductor-based astable 555 multivibrator.
Figure 1 An astable 555 timer circuit based on an inductor, diode, and resistor.
At power-on, the inductor voltage (VL) spikes up and exceeds the 555’s threshold voltage of 2Vcc/3. Output (Vo) at pin 3 goes low and the discharge transistor at pin 7 turns on, which provides a low-resistance path to ground. Inductor current (IL) begins to rise as VL and the voltage at pin 2 (V2) and pin 6 (V6) all fall exponentially.
When V2 gets below the 555’s trigger voltage of Vcc/3, Vo goes high, and the discharge transistor turns off. Because IL was interrupted, the inductor’s voltage reverses, which forward-biases the flywheel diode (D). Pin 7 gets clamped to one diode forward voltage above Vcc. Both IL and VL start to fall towards zero while V2 climbs toward Vcc.
When V2 crosses 2Vcc/3 again, Vo goes low, the discharge transistor turns on, and the train of regular high and low output pulses ensues. The expected waveforms are shown in Figure 2.
Figure 2 The simulated waveforms using Tinkercad (setting: 15 µs/div).
For each state of Vo, we derived the first-order differential equation of the effective circuit. This led us to Equation 1 for calculating the pulse widths:
The symbols are defined in Table 1, where the columns for TH and TL list specific values that the symbols take on. We also accounted for Rs, the inductor’s DC resistance; RON = 59.135/Vcc^0.8101, the resistance of the discharge transistor at pin 7 (refer to “Design low-duty-cycle timer circuits”); and VD = 0.6 V, the diode forward voltage.
Table 1 Formulas to predict timing characteristics.
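The empirical on-resistance model for the discharge transistor is easy to evaluate; here is a small sketch for the 5 V supply used in the experiment (the formula is quoted from the cited “Design low-duty-cycle timer circuits” article):

```python
def r_on_ohms(vcc):
    """Discharge-transistor on-resistance model from
    'Design low-duty-cycle timer circuits': RON = 59.135 / Vcc^0.8101."""
    return 59.135 / vcc ** 0.8101

# At the Vcc = 5.00 V used for the measurements in Table 3
print(f"RON at 5 V = {r_on_ohms(5.0):.1f} ohms")  # about 16 ohms
```

This RON appears in series with the inductor path during the output-low interval, which is why it enters the TL column of Table 1.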
To test these ideas, we prepared a spreadsheet calculator that predicts TH, TL, and other output characteristics. Then we picked the components listed in Table 2, used a digital LCR tester (SZBJ BM4070) to measure their actual values, and plugged the numbers into the calculator. The predicted attributes of Vo are listed in Table 3.
Table 2 Components for the experimental circuit.
Table 3 Predicted versus measured values (Vcc=5.00 volts).
Finally, we connected a USB-powered test and measurement device, the Digilent Analog Discovery 3 (AD3), to our laptop to supply +5 V to the experimental circuit (Figure 3) and observe the waveforms from pins 2/6 and 3 of the IC (Figure 4). We tested 8 chips from a bin of assorted 555s and noticed that while TH was consistent, the TL values annoyingly lacked precision. Nonetheless, when we compared the AD3 measurements with the predicted values in Table 3, we saw that Equation 1 fairly modeled the output of the new multivibrator.
Figure 3 Experimental set-up with the Digilent Analog Discovery 3 supplying +5 V to the experimental circuit.
Figure 4 Waveforms of V2, V6, and Vo, and measurements for Vo.
Arthur Edang (M.Sc) taught Electronics and Communications Engineering courses at the Don Bosco Technical College (Mandaluyong, Philippines) for 25 years. His current interests include nonlinear phenomena and chaos in circuits, creative approaches to teaching and research, and adaptive e-books. He started Thinker*Tinker—the YouTube channel where viewers can “examine circuits, play with their equations, and make designs work.”
Maria Lourdes Lacanilao-Edang (M.Engg) has instructed a diverse range of courses in the field of computer engineering, from Basic Electronics to Computer Design Interface Projects. Currently serving as faculty member at the University of Santo Tomas (Manila, Philippines), she specializes in the IT Automation track with particular interests in embedded systems, web and mobile app development, and IoT.
Related Content
- Inductor-based astable 555 timer circuit
- Design low-duty-cycle timer circuits
- Schmitt trigger provides alternative to 555 timer
- 555 timer triggers phase-control circuit
- Adjustable triangle/sawtooth wave generator using 555 timer
IP partnerships stir the world of FPGA chiplets

Tie-ups between suppliers of embedded FPGA (eFPGA) IP and UCIe chiplet IP mark a new era of FPGA chiplet integration in die-to-die connectivity. Chiplets are rapidly being adopted as heterogeneous multi-chip solutions that enable lower latency, higher bandwidth, and lower cost than discrete devices connected via traditional interconnects on a PCB.
Take YorChip, a supplier of UCIe-compatible IP, which is employing QuickLogic’s eFPGA IP technology to create the first UCIe-compatible FPGA chiplet ecosystem. Universal Chiplet Interconnect Express (UCIe) is an open standard for connecting small, modular blocks of silicon called chiplets. Kash Johal, founder of YorChip, calls his company’s partnership with QuickLogic a giant leap for FPGA technology.
Figure 1 QuickLogic and YorChip have partnered to develop the industry’s first UCIe-enabled FPGA.
The two companies claim that this strategic partnership aims to enable an ecosystem allowing chiplet developers to create a customized system and use chiplets for prototyping and doing early market production. QuickLogic teamed up with connectivity IP supplier eTopus in a similar tie-up last year to create a disaggregated eFPGA-enabled chiplet template solution.
QuickLogic combined its Australis eFPGA IP Generator with chiplet interfaces from eTopus to produce standard eFPGA-enabled chiplet templates. Each template will be designed with native support for chiplet interfaces, including the Bunch of Wires (BoW) and UCIe standards. According to QuickLogic, unlike discrete FPGAs with pre-determined resources of FPGA lookup tables (LUTs), RAM, and I/Os, the disaggregated eFPGA-enabled chiplet template will be available initially as a configurable IP and eventually as known good die (KGD) chiplets.
Figure 2 The disaggregated eFPGA chiplet template solution supports both BOW and UCIe interfaces.
Such collaborations to create eFPGA–enabled chiplet solutions mark an important trend in developing chip-to-chip interconnect technology. In April 2023, the European research institute Fraunhofer IIS/EAS entered a collaboration with eFPGA IP supplier Achronix to build a heterogeneous chiplet solution.
Fraunhofer IIS/EAS, which provides system concepts, design services and fast prototyping in most advanced packaging technologies, will use Speedcore eFPGA IP from Achronix to explore chip-to-chip transaction layer interconnects such as BOW and UCIe. One key application in this project covers the connection of high-speed analog-to-digital converters (ADCs) alongside Achronix eFPGA IP for pre-processing in radars as well as wireless and optical communication.
Figure 3 Fraunhofer IIS/EAS has selected Achronix’s eFPGAs to build a heterogeneous chiplet demonstrator.
Brian Faith, CEO of QuickLogic, says that these efforts to use eFPGA for building heterogeneous chiplets embody a new era of FPGA chiplet integration, and he sees their application in the evolving edge IoT and AI/ML markets. The design journey toward building the world of FPGA chiplets has already started, and we are likely to hear about more such partnerships incorporating FPGAs into chip-to-chip interconnect technology.
Related Content
- Chiplet interconnect handles 40 Gbps/bump
- Chiplets Gain Popularity, Integration Challenges
- Chiplets advance one design breakthrough at a time
Desktop DC source shows real precision where it’s needed

I’ve always been intrigued by high-precision instruments, as they usually represent the best of engineering design, craftsmanship, elegance, and even artistry. One of the earliest and still best examples that I recall was the weigh-scale design by the late, legendary Jim Williams, published nearly 50 years ago in EDN. His piece, “This 30-ppm scale proves that analog designs aren’t dead yet,” details how he designed and built a portable, AC-powered scale for nutritional research using standard components and with extraordinary requirements: extreme resolution of 0.01 pound out of 300.00 pounds, accuracy to 30 parts per million (ppm), and no need for calibration during its lifetime.
Jim’s project was a non-production, one-off unit, and while its schematic (Figure 1) is obviously important, it tells only part of the story; there are many more lessons in his description.
Figure 1 This schematic from Jim Williams’ 1976 EDN article on design of a precision scale teaches many lessons, but there’s much more than just the schematic to understand. Source: Jim Williams
To meet his objectives, he identified every source of error or drift and then methodically minimized or eliminated each one via three techniques: using better, more-accurate, more-stable components; employing circuit topologies which self-cancelled some errors; and providing additional “insurance” via physical EMI and thermal barriers. In this circuit, the front-end needed to extract a miniscule 600-nV signal (least significant digit) from a 5-V DC level—a very tall order.
I spoke to Jim a few years before his untimely passing, after he had written hundreds of other articles (see “A Biography of Jim Williams”), and he vividly remembered that design and the article as the event which made him realize he could be a designer, builder, and expositor of truly unique precision, mostly analog circuits.
Of course, it’s one thing to handcraft a single high-performance unit, but it’s a very different thing to build precision into a moderate production-volume instrument. Yet companies have been doing this for decades, as typified by—but certainly not limited to—Keysight Technologies (formerly known as Agilent and prior to that, Hewlett-Packard) and many others, too many to cite here.
Evidence of this is seen in the latest generation of optical test and measurement instruments, designed to capture and count single photons. That’s certainly a mark of extreme precision because individual photons generally don’t have much energy, don’t like to be assessed or captured, and self-destruct when you look at them.
I recently came across another instrument that takes a simple function to an extreme level of precision: the DC205 Precision Voltage Source from Stanford Research Systems. This desktop unit is much more than just a power supply, as it provides a low-noise, high-resolution output which is often used as a precision bias source or threshold in laboratory-science experiments (Figure 2).
Figure 2 This unassuming desktop box represents an impressively high level of precision and stability in an adjustable voltage source. Source: Stanford Research Systems
Its bipolar, four-quadrant output delivers up to 100 V with 1-μV resolution and up to 50 mA of current. It offers true 6-digit resolution with 1 ppm/°C stability (24 hours) and 0.0025% accuracy (one year).
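As a back-of-envelope check (my own arithmetic, not SRS’s published design data), it’s worth asking what DAC width those resolution figures imply:

```python
import math

# "True 6-digit resolution" means ~10^6 distinguishable output codes per
# range, which already demands a DAC of about 20 bits.
digits = 6
codes_per_range = 10 ** digits
bits_6_digit = math.ceil(math.log2(codes_per_range))   # -> 20 bits

# Taken literally, 1-uV steps across a bipolar +/-100-V (200 V total) span
# would need far more: 200 V / 1 uV = 2e8 codes, i.e. a ~28-bit DAC --
# which is why precision sources are almost certainly built with multiple
# switched ranges rather than one monolithic converter.
codes_literal = 200.0 / 1e-6
bits_literal = math.ceil(math.log2(codes_literal))     # -> 28 bits

print(bits_6_digit, bits_literal)
```

The multi-range interpretation is my assumption; the manual’s block diagram (Figure 3) doesn’t spell it out.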
Two other features caught my attention. First, it uses a linear power supply (yes, they are still important in specialty applications) to minimize output noise, presumably only for the voltage-output block rather than for the entire instrument. There’s also the inclusion of a DB-9 RS-232 connector in addition to its USB and fiber optic interfaces. I haven’t seen an RS-232 interface in quite a while, but I presume they had a good reason to include it.
The block diagram in the User’s Manual reveals relatively little, except to indicate the unit has three core elements which combine to deliver the instrument’s performance: a low-noise, stable voltage reference; a high-resolution digital-to-analog converter; and a set of low-noise, low-distortion amplifiers (Figure 3).
Figure 3 As with Jim Williams’ scale, the core functions of the DC205 look simple, and may be so, but it is also the unrevealed details of the implementation that make the difference in achieving the desired performance. Source: Stanford Research Systems
I certainly would like to know more of the design and build details that squeeze such performance out of otherwise standard-sounding blocks.
As this unit targets lab experiments in physics, chemistry, and biology disciplines, it also includes a feature that conventional voltage sources would not: a scanning (ramping) capability. This triggerable voltage-scanning feature gives the user control over start and stop voltages, scan speed, and scan function, with scan speeds settable from 100 ms to 10,000 s; the scan function can be either a ramp or a triangle wave. Further, for operating in the 100-V “danger zone”, the user must plug a jumper into the rear panel to deliberately and consciously allow operation in that region.
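To visualize what those two scan functions look like, here’s a minimal Python sketch of ramp and triangle voltage profiles (my own model of the feature as described, not SRS firmware; the function name and parameters are hypothetical):

```python
def scan_profile(v_start, v_stop, n_points, shape="ramp"):
    """Generate an n_points-long list of setpoints sweeping from
    v_start to v_stop (ramp), or out and back again (triangle)."""
    if shape == "ramp":
        # Evenly spaced linear sweep from v_start to v_stop inclusive.
        return [v_start + (v_stop - v_start) * i / (n_points - 1)
                for i in range(n_points)]
    if shape == "triangle":
        # Sweep up, then retrace back down without repeating the peak.
        up = scan_profile(v_start, v_stop, n_points // 2 + 1, "ramp")
        return up + up[-2::-1]
    raise ValueError(shape)

ramp = scan_profile(0.0, 10.0, 5)             # 0 -> 10 V in 2.5-V steps
tri = scan_profile(0.0, 10.0, 8, "triangle")  # 0 -> 10 -> 0 V
```

In the real instrument, the scan speed setting would determine how fast such a setpoint sequence is stepped through.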
In addition to the DB-9 RS-232 interface supporting legacy I/O and an optical link for state-of-the-art I/O, I noticed another interesting feature called out in the well-written, readable, crisp, and clear user’s manual: how to change the AC-line voltage setting. Some instruments I have seen use a slide switch and some use plug-in jumpers (don’t lose them), but this instrument uses a thumbwheel rotating selector, as shown in Figure 4.
Figure 4 Even a high-end instrument must deal with different nominal power-line voltages, and this rotary switch in the unit makes changing the setting straightforward and resettable. Source: Stanford Research Systems
In short, this is a very impressive standard-production instrument with precision specifications and performance, with what seems like a very reasonable base price of around $2300.
I think about it this way: In the real world of sensors, signal conditioning, and overall analog accuracy, achieving stable, accurate performance to 1% is doable with reasonable effort; getting to 0.1% is much harder; and reaching 0.01% is a real challenge. Yet both custom and production-instrumentation designers have mastered the art and skill of going far beyond those limits.
It’s similar to when I first saw a list of fundamental physical constants, such as the mass or magnetic moment of an electron, which had been measured (not defined) to seven or eight significant figures with an uncertainty only in the last digit. I felt compelled to do further investigation to understand how they reached that level of precision and confidence, and how they credibly assessed their sources of error and uncertainty.
What’s the tightest measurement or signal-source accuracy and precision you have had to create? How did you confirm the desired level of performance was actually achieved—if it was?
Bill Schweber is an EE who has written three textbooks, hundreds of technical articles, opinion columns, and product features.
Related Content
- Power needs go beyond just plain voltage and current
- Learning to like high-voltage op-amp ICs
- Jim Williams’ contributions to EDN
- The Quest for Quiet: Measuring 2nV/√Hz Noise and 120dB Supply Rejection in Linear Regulators, Part 2
- Transistor ∆VBE-based oscillator measures absolute temperature
The post Desktop DC source shows real precision where it’s needed appeared first on EDN.
Amazon Sidewalk network is getting silicon traction

New silicon solutions are emerging for Amazon Sidewalk network, and these chips come alongside developer tools providing step-by-step direction and expert advice for Amazon Sidewalk device development.
At its fourth annual Works With Developers Conference, Silicon Labs unveiled two system-on-chips (SoCs) optimized for Amazon Sidewalk: SG23 and SG28. These chips complement the Silicon Labs Pro Kit for Amazon Sidewalk previously announced by the Austin, Texas-based semiconductor supplier.
Figure 1 Amazon Sidewalk is built on an architecture comprising radio, network, and application layers. Source: Silicon Labs
The always-on, community-driven Amazon Sidewalk is a shared network that helps devices like Amazon Echo, Ring security cameras, outdoor lights, motion sensors, and Tile trackers work better at home and beyond the front door. It uses three different radios: Bluetooth LE for device provisioning and nearby device connectivity, sub-GHz FSK for connectivity up to one mile, and a proprietary CSS radio for extreme long range.
Most Amazon Sidewalk end-devices will support Bluetooth LE and one of the two long-range protocols (FSK or CSS) operating at 900-MHz frequencies to cover longer distances. So, the SG28 is a dual-band SoC with radios for sub-GHz FSK as well as Bluetooth LE, allowing device makers to simplify designs and reduce costs by putting the two most-used radios on Sidewalk end-devices into one package. The SG23, on the other hand, provides security and a robust sub-GHz link budget for long-range, end-node devices.
Figure 2 The two SoCs are optimized for Amazon Sidewalk with extensive developer support. Source: Silicon Labs
Amazon Sidewalk is one of the exciting developments in the Internet of Things (IoT) space since it was launched in 2019. It pushes connectivity beyond the walls of the smart home while employing smart home devices like cameras and speakers as gateways for supporting long-range use cases. According to Amazon, it’s a community network built by the community for the community.
A neighborhood network
Silicon Labs CTO Daniel Cooley calls Amazon Sidewalk a neighborhood network. “While Bluetooth gives users an easy way to provision and deploy new devices onto the network, the sub-GHz band is designed to support device communications over one mile, allowing for new edge applications in areas like smart agriculture and smart cities.”
Figure 3 Amazon Sidewalk Bridges will pick up the message from the compatible device and route it through the AWS cloud to the user with multiple layers of security. Source: Silicon Labs
Besides chips like SG23 and SG28, Silicon Labs has launched a design kit that supports the development of wireless IoT-based devices on Bluetooth and sub-GHz wireless protocols for Amazon Sidewalk. The Wireless Pro Kit is built around a KG100S radio board that provides a complete reference design to support Bluetooth, FSK, and CSS protocols used in Amazon Sidewalk.
The kit also includes a BG24 radio board and FSK/CSS adapter board for developers who want a discrete design. Its mainboard contains an onboard J-Link debugger with a packet trace interface and a virtual COM port, enabling application development and debugging of the attached radio board as well as external hardware through an expansion header.
Figure 4 The Pro Kit for Amazon Sidewalk provides the necessary tools for developing high-volume, scalable IoT applications. Source: Silicon Labs
Silicon Labs has been working closely with Amazon to navigate the Amazon Sidewalk development process. After all, it’s a new network, and developers need to be educated on how to best create Amazon Sidewalk devices. Recognizing this need, Silicon Labs has joined hands with Amazon to create the Amazon Sidewalk Developer’s Journey with Silicon Labs.
Amazon Sidewalk was opened for developers on 28 March 2023.
Related Content
- It’s Amazon’s Sidewalk, You Just Live On It!
- What’s This Sidewalk Thing? Alexa! ‘Splain!
- MediaTek, Amazon Aim to Lead in Smart Homes
- Top smart home trends plus: is Matter the key to interoperability?
- Silicon Labs dual band SoCs for Amazon Sidewalk BLE and sub-GHz FSK
The post Amazon Sidewalk network is getting silicon traction appeared first on EDN.
LCDs: Evolutions, innovations, and differentiations stave off irrelevance

It seems like just yesterday…but in reality, it was nearly 20 years ago. What am I talking about? My first substantive coverage of direct-view displays appeared in EDN on March 5, 2005, and I’m feeling déjà vu as I read back over it. Cathode ray tube (CRT) based displays were at the time dominant for both television and (desktop) computer applications, with plasma displays restricted to ultra-large screen applications and videophiles, the latter valuing their black levels and other image quality attributes. And liquid crystal displays (LCDs)? They were in laptops, of course, albeit with much lower quality (not to mention higher cost) than is the case today.
Speaking of cost, although LCD technology had also begun penetrating the standalone computer display market in 2005, this quote from an AnandTech writeup of the era tells the tale (bracketed additions for clarification are mine):
People on a budget might still prefer a good 19″ CRT, which can save about $100 on the cost of the [equivalent 19” liquid crystal] display.
And at the time, the pricing disparity between CRTs and LCDs grew further (exponentially so) as screen size increased. Conversely, today it’s getting increasingly difficult to even find a 19” LCD computer monitor, with the bulk of the market moving to 22” and larger units. And LCD TVs in 2005? Virtually nonexistent, although that year ended up being the knee of the upward curve.
Five-plus years later, in my follow-up feature article, the display market had notably evolved, a situation reflected in the writeup’s “Display-technology advancements: Change is the only constant” title. Here’s how I launched into it:
Repeatedly predicted and repeatedly delayed on many occasions, the transition from CRTs to LCDs has finally occurred, even in cost-sensitive emerging markets and across dominant application segments: computer monitors and televisions.
And as LCDs were maturing, next-generation display technologies such as OLED (which received little more than passing mention in my piece a half-decade earlier) were coming to the fore:
Some [editor note: LCD] developers focus their efforts on making incremental improvements to a “vanilla” LCD foundation. Other cases warrant a more revolutionary transition—to OLEDs (organic light-emitting diodes) for an ultra-svelte consumer-electronics device, for example, or to an “electronic-paper” display for a digital reader.
Fast forward to today, and it’s OLED that’s ascendant, increasingly at LCDs’ expense. By virtue of its self-illumination characteristics, negating the need for a bulky, rigid, and power-hungry separate backlight (historically cold cathode fluorescent lamp—CCFL—based, now near-exclusively LED-based), OLEDs find use in all but the most cost-sensitive smartphones, and are essential for emerging unified-display foldable phones. They haven’t quite made it to larger-screen devices—specifically Apple’s iPads—yet, although persistent rumor suggests it’s a matter of when, not if, and Android-based OLED-equipped tablet alternatives are already in the market. OLEDs are, for perhaps obvious reasons, already pervasive in smaller-screen smart watches. For computer monitors, some pundits believe this is the year that OLED-based displays will finally go mainstream. And while it’s less clear (at least to me) that OLED will end up in volume televisions, several emerging alternatives are also vying to be LCD’s successor, as I discussed in March 2019: specifically, QLED and microLED.
If you’re an LCD supplier, where does this ongoing impermanence leave you? To some degree, the answer depends on whether you’re also an OLED, QLED, and/or microLED supplier. That said, given that LCD technology is more mature, its manufacturing yields are still likely higher and its costs consequently lower, enabling you to price LCDs more aggressively than alternatives and trade off per-unit profit margin for higher unit sales. But clearly, if you’ve sunk a lot of money into developing LCD supply capacity and don’t have a ready-for-production technology alternative in your hip pocket, you’re definitely going to want to “milk” the LCD market as long as possible to recoup your investment (and more).
Fundamental spec improvements are one key means of accomplishing this market-life-extension objective. Boosting the peak refresh rate is one popular example. Although I’m admittedly skeptical about the reality behind the hype (upfront disclosure: I’m not a gamer, so consider my lack of eyes-on experience when assessing my cynicism), it does seem to be effective both in spurring new (and replacement) display sales and in differentiating display companies from their competitors. Take, for example, ASUS’ VA229HR 21.5” LCD computer monitor, based on in-plane switching (IPS) technology and touting 75-Hz refresh, two of which I bought last fall. Intended for the computers that I periodically build and donate to local charities, they were both used—one was from Amazon’s Warehouse, the other an Amazon Refreshed unit—therefore costing me only ~$75 each. But their brand-new equivalents, if memory serves me, were selling for ~$125 at the time, versus ~$100 for conventional 60-Hz refresh equivalents.
Admittedly, those conventional equivalents probably sold in higher volumes, albeit with lower per-unit profit margins. And the conventional-unit competitive environment was also likely much more crowded. To that latter point, nowadays “high refresh” typically translates to “120 Hz or more” versus my more modest last-year uptick. And to my earlier-admitted skepticism about the concept, reflective of dubiousness as to whether such spec improvements are meaningfully perceptible in reality (analogies to audiophile gear and content are apt), I’m conversely enthusiastic about the ability to dynamically throttle refresh rate downward if content characteristics allow, as a means of maximizing overall system battery life. Microsoft apparently agrees, judging from upcoming enhancements it’s making to Windows 11.
Note, too, that the ASUS VA229HR is also an IPS LCD monitor, seemingly belying its low price and high refresh rate. Both attributes were historically strengths of the alternative twisted-nematic (TN) LCD approach, along with that of a third technology—vertical alignment (VA)—which I neglected to mention back in 2019. But IPS’ “improved viewing angles, deeper black levels, and other enhancements”, to quote my earlier article, won out, with refresh rates also boosted over time and price addressed by volume cost efficiencies.
More generally, a steady stream of image quality improvements is key to actualizing LCD suppliers’ aspirations to keep their preferred display technology relevant in the face of upstart options. Other improvement focus areas include response time, deeper black levels and associated wider contrast ratios, along with wide viewing angles, expanded color gamuts, and higher resolutions (both in general and at a given panel size), all implemented via schemes like:
- Finer-pitch pixels (and subpixels)
- Backlights with zone-based local dimming, implemented using multicolor LEDs, and
- A variety of “glass” coating approaches
not to mention the high-bandwidth interfaces on which many LCD improvements also depend.
As an admittedly somewhat extreme example of the lengths to which LCD suppliers have gone to remain germane, look at Apple’s 32” Pro Display XDR, which was announced back in mid-2019:
Here are some choice excerpts from the press release that publicly introduced it:
- 6016 x 3384 Retina 6K resolution with more than 20 million pixels
- P3 wide color gamut and true 10-bit color for over 1 billion colors
- The industry’s best polarizer technology, delivering a superwide, color-accurate, off-axis viewing angle.
- To manage reflected light, Pro Display XDR has an industry-leading anti-reflective coating and offers an innovative new matte option called nano-texture, with glass etched at the nanometer level for low reflectivity and less glare.
- A direct backlighting system with a large array of LEDs that produce 1,000 nits of full-screen brightness and 1,600 nits of peak brightness [editor note: 1,000 nits sustained]
- With a single Thunderbolt 3 cable, Pro Display XDR connects seamlessly to the Mac product line, including the new Mac Pro, which supports up to six displays for a breathtaking 120 million pixels.
Sounds great, right? Here’s the reality check:
Pro Display XDR starts at $4,999, the Pro Stand is $999 and the VESA Mount Adapter is $199.
In fairness, last March Apple subsequently unveiled a more modestly priced (comparatively, at least) LCD, the 27” 5K Studio Display, alongside its first-generation Mac Studio computers:
Studio Display is $1,599 (US), and $1,499 (US) for education. Additional technical specifications, including nano-texture glass and a choice of stand options, are available at apple.com/store.
Its peak brightness and other specs are more modest (while still quite impressive, mind you), which, in combination with its smaller panel size and associated lower resolution, assists with the comparative-to-Pro-Display-XDR price decrease. And speaking of specs, curiously, neither Apple display seemingly documents the oft-important response-time metric.
Despite its lower price, the Studio Display adds some features that I more generally wanted to highlight as an integration trend that other LCD suppliers are also adopting:
- A 12MP Ultra Wide camera with Center Stage, a feature that automatically keeps users centered in the frame as they move around.
- Studio Display also includes a studio-quality, three-microphone array with an especially low noise floor for crystal-clear calls and voice recordings.
- A high-fidelity six-speaker sound system, the best ever created for Mac, delivering an unbelievable listening experience.
Integrated KVM (keyboard, video and mouse) switching is another increasingly common inclusion in modern displays.
And beyond conventional computer display, television and mobile device markets, LCD suppliers are also partnering with their system-development customers to cultivate demand in additional new markets, such as baby and security monitors and smart displays. Take, for example, the portable monitor: the G-STORY GST56 shown below, which I personally own and regularly use (again, I’m not a gamer, so overlook the stock photo’s screen):
It doesn’t contain its own battery; instead, it’s fueled by an external USB-C (or USB-A, via adapter) power source, which can (assuming sufficient current output) include the laptop computer its video input is simultaneously tethered to. G-STORY includes an array of bundled cables, along with a cover that doubles as the stand; I later added a sleeve to the mix. And check out the specs, keeping in mind that the GST56 only cost me $127.99 (on sale) in January:
- 165 Hz peak refresh rate
- 1 ms response time
- 15.6-inch IPS screen
- 1920×1080 pixel resolution
- 350 cd/m² peak brightness
- 800:1 contrast ratio
- Built-in speakers plus headphone audio output jack
Here it is in action at Starbucks (I blurred both screens’ content post-photo capture for privacy):
Some other manufacturers’ conceptually similar displays, often referred to as monitor extenders, literally attach the supplemental LCD(s) to the laptop and its primary screen:
And then there are so-called “field monitors”, which tether via HDMI or SDI to a still or video camera and provide a larger-screen supplement to the integrated analog or digital viewfinder and/or LCD screen. Sometimes, as with this example unit, the Ninja V from Atomos, a popular premium supplier, they even include an SSD or at least memory card slot(s), to provide a higher-capacity video recording alternative to the camera’s integrated storage:
In a recent writeup, a teardown of an LED light bulb with an integrated backup battery, I back-referenced an earlier post which had noted that as LED light bulb technology matured, manufacturers were variously differentiating their products in search of competitive isolation and profits. That writeup had similarly back-referenced an earlier piece on the evolution of Bluetooth-based peripherals, which contains this quote:
Such diversity within what’s seemingly a mature and “vanilla” product category is what prompted me to put cyber-pen to cyber-paper for this particular post. The surprising variety I encountered even during my brief period of research is reflective of the creativity inherent to you, the engineers who design these and countless other products. Kudos to you all!
That quote applies not only to Bluetooth audio adapters but also to LED light bulbs. And as this writeup hopefully gets across, to LCDs, too. Kudos, display developers! Let me know in the comments any additional cool LCD-derived products you’ve come across, or anything else LCD-related that you’d like to share.
—Brian Dipert is the Editor-in-Chief of the Edge AI and Vision Alliance, and a Senior Analyst at BDTI and Editor-in-Chief of InsideDSP, the company’s online newsletter.
Related Content
- Master of some: Direct-view-display technology
- Dissecting a battery-backed LED light bulb
- LED light bulb manufacturers diversify in search of sustainable profits
- Display technologies: Refinements and new entrants
The post LCDs: Evolutions, innovations, and differentiations stave off irrelevance appeared first on EDN.
An RF chipmaker’s quest to ride the AI bandwagon

Nordic Semiconductor, a supplier of Bluetooth and wireless IoT chips, has announced the acquisition of Atlazo, a San Diego, California-based startup specializing in always-on AI processors, sensor interface design, and energy management for tiny edge devices. Nordic will acquire Atlazo’s IP portfolio and the core team comprising eight engineers.
Atlazo, founded in 2016, has been targeting its on-device AI processor technology at the rapidly growing market for true wireless stereo (TWS) earbuds, hearing aids, wearables, and health monitoring devices. The company claims that its AI/ML technology allows a processor to operate at the lowest possible energy point for a given task.
Nordic plans to leverage this hyper-low-power AI/ML processor technology in its future wireless system-on-chips (SoCs). Moreover, by integrating Atlazo’s sensor technologies for health applications, Nordic can consolidate its chip designs for wearables and extend its reach to smart health devices serving optical heart monitoring and continuous glucose monitoring.
Nordic CEO Svenn-Tore Larsen calls this deal strategically significant despite being a small bolt-on acquisition. “We anticipate that we will begin to see the initial benefits of this acquisition within 12-18 months of closing the transaction.”
Besides the IP portfolio and engineering team, the deal includes certain other assets, including Atlazo’s office in San Diego, which will become Nordic’s third R&D site in the United States. The transaction, dependent on regulatory approval, is expected to be completed by the end of 2023.
Related Content
- Getting a Grasp on AI at the Edge
- Edge Intelligence Ticks Many Boxes for AI
- Adding Low-Power AI/ML Inference to Edge Devices
- Top 10 Processors for AI Acceleration at the Endpoint
- Nordic Set to Buy U.S. Startup Atlazo for AI Hardware IP
The post An RF chipmaker’s quest to ride the AI bandwagon appeared first on EDN.
NIST finalizes 3 algorithms for post-quantum cryptography

After selecting four cryptographic algorithms designed to withstand attack by quantum computers, the National Institute of Standards and Technology (NIST) has started the process of standardizing these algorithms. NIST has released draft standards for three of the four algorithms it selected in 2022, while the draft standard for FALCON, the fourth algorithm, will be released in about a year.
In other words, NIST is calling on the cryptographic community to provide feedback on the draft standards. It will accept feedback on its post-quantum cryptography standardization project until 22 November 2023.
Figure 1 NIST has been working with government, academia, and industry from around the world to develop a new set of encryption standards that will work with our current computers while being resistant to the quantum machines of the future.
Currently, sensitive electronic information—like email and wire transfer data—is protected using public-key encryption techniques based on math problems that conventional computers cannot practically solve. A sufficiently powerful quantum computer, however, could break these encryption mechanisms.
Quantum computers capable of that feat don’t exist yet, but security experts emphasize the need to plan ahead, partly because it takes years to integrate new algorithms across all computer systems.
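To make the threat concrete, here’s a deliberately tiny textbook-RSA sketch in Python (my illustration, not part of NIST’s standards; real keys use 2048-bit moduli). Its security rests entirely on the difficulty of factoring the public modulus, which is exactly the problem Shor’s algorithm solves efficiently on a quantum computer:

```python
# Toy RSA with tiny primes -- illustration only, trivially breakable.
p, q = 61, 53
n = p * q                      # public modulus (3233)
phi = (p - 1) * (q - 1)        # Euler's totient of n
e = 17                         # public exponent
d = pow(e, -1, phi)            # private exponent (modular inverse, Py 3.8+)

msg = 42
cipher = pow(msg, e, n)        # encrypt with the public key (e, n)
plain = pow(cipher, d, n)      # decrypt with the private key d
assert plain == msg

# An attacker who can factor n into p and q recovers phi, and from it
# the private key -- which is why factoring-capable quantum computers
# break RSA:
recovered_d = pow(e, -1, (61 - 1) * (53 - 1))
assert recovered_d == d
```

The lattice- and hash-based schemes NIST selected are built on different math problems for which no efficient quantum algorithm is known.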
Post-quantum cryptography
In 2016, when the idea of quantum computers started making waves, NIST began its efforts to develop quantum-resistant algorithms by calling cryptographic experts to submit candidate algorithms as part of its post-quantum cryptography standardization project. Cryptographic experts from dozens of countries submitted 69 eligible algorithms by the November 2017 deadline set by NIST.
Next, NIST released these 69 candidate algorithms for cryptographers to analyze, and invited them to crack the algorithms if possible. In an open and transparent process, NIST organized multiple rounds of evaluation to reduce the number of candidates.
In July 2022, NIST selected four algorithms for its Federal Information Processing Standard (FIPS) initiative. First, CRYSTALS-Kyber, covered in FIPS 203, is designed for general encryption purposes such as creating secure websites.
Second, CRYSTALS-Dilithium, covered in FIPS 204, is designed to protect the digital signatures we use when signing documents remotely. Third, SPHINCS+, covered in FIPS 205, is also designed for digital signatures. Fourth, FALCON, also designed for digital signatures, is slated to receive its own draft FIPS in 2024.
Figure 2 NIST announced the first four post-quantum cryptography algorithms based on structured lattices and hash functions, two families of math problems that could resist a quantum computer’s assault.
Multiple rounds of evaluation
According to Dustin Moody, a NIST mathematician and leader of the project, while these three algorithms will constitute the first group of post-quantum encryption standards NIST creates, they will not be the last. In fact, besides the four algorithms NIST selected in 2022, the project team has also selected a second set of algorithms for ongoing evaluation.
These algorithms are intended to augment the first set announced in 2022. NIST plans to publish draft standards in 2024 for any of these algorithms selected for standardization. “These additional algorithms are designed for general encryption, but they are based on different math problems than CRYSTALS-Kyber,” Moody said. “They will offer alternative defense methods should one of the selected algorithms show a weakness in the future.”
Moody said that it’s likely that there will be one or two additional algorithms. “For the moment, we are requesting feedback on the drafts,” he added. “Just in case we need to change anything or missed anything.”
Related Content
- Post-Quantum Cryptography: Are You Ready?
- NIST Hits Quantum Teleport Key Out of the Park
- The Future of Cryptography in Hardware Processors
- Hardware security entering quantum computing era
- Why cryptographers are worried about a post-quantum world
The post NIST finalizes 3 algorithms for post-quantum cryptography appeared first on EDN.
Spark taps Infineon for wireless charging module

The Yeti 500W industrial wireless charging module from Spark Connected employs Infineon’s dual-core PSoC Bluetooth Low Energy MCU for intelligent control. It also incorporates Infineon’s CoolGaN power devices for improved efficiency and reduced EMI.
Capable of delivering up to 500 W of power, the ready-to-integrate module can power and charge industrial machinery, autonomous mobile robots, automated guided vehicles, and light electric vehicles. With an efficiency of over 93%, the Yeti 500W reduces power losses and eases thermal management. The charging module, which includes a transmitter and receiver, offers a misalignment tolerance of over 40 mm in all directions. As a result, reliable charging is ensured even if the orientation is not perfectly accurate.
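To put that efficiency figure in thermal terms, a quick bit of arithmetic (my own, not Spark’s published data) shows how much heat the charging link must shed at full load:

```python
def loss_watts(p_out, efficiency):
    """Power dissipated as heat for a given output power and
    end-to-end efficiency (0 < efficiency <= 1)."""
    p_in = p_out / efficiency
    return p_in - p_out

# At 500 W out and 93% efficiency, roughly 38 W must be shed as heat;
# at a hypothetical 85%, it would be nearly 90 W -- a substantially
# harder thermal-management problem.
print(round(loss_watts(500, 0.93), 1))  # -> 37.6
print(round(loss_watts(500, 0.85), 1))  # -> 88.2
```

This is why even a few points of efficiency matter in a connectorless, sealed industrial charger with limited convective cooling.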
Yeti 500W is compatible with all common battery types and voltages. Safety features include overvoltage and overcurrent protection to shield both the charger and connected equipment from potential damage or hazards. By eliminating the need for physical connectors and cables, the wireless charging system enables flexible charging, while enhancing efficiency and safety in industrial settings.
The Yeti 500W industrial wireless charging module will be available for customer sampling in Q3 2023.
Find more datasheets on products like this one at Datasheets.com, searchable by category, part #, description, manufacturer, and more.
The post Spark taps Infineon for wireless charging module appeared first on EDN.
Akoustis samples single-crystal BAW filter

Integrated device manufacturer Akoustis is sampling its single-crystal bulk acoustic wave (BAW) filter to a tier-1 enterprise-class Wi-Fi access-point customer. Aimed at next-generation enterprise Wi-Fi 7 architectures, the XBAW filter allows more precise filtering to capture stronger and more reliable signals.
According to Akoustis, it has redeveloped its single-crystal nanomaterials and created a new process that delivers BAW filters with higher power handling and improved harmonics compared to the company’s standard polycrystal materials. With the single-crystal XBAW filter, access-point makers will have the ability to design products with higher transmit power, enabling greater range and obstruction penetration. Improved harmonics increase the linearity of the power amplifier in the radio chain, reducing noise in the transmit path.
The tier-1 customer plans to use the XBAW filter in various Wi-Fi 7 routers that are slated to begin shipping in the first half of 2024. Currently, the routers are expected to employ 2×2 and 4×4 MU-MIMO architectures, with multiple XBAW filters per device.
A datasheet for the single-crystal XBAW filter was not available at the time of this announcement.
The post Akoustis samples single-crystal BAW filter appeared first on EDN.
Automotive SMD packs rectifier and TVS

A two-in-one surface-mount device, the R3T2FPHM3 from Vishay combines a rectifier and transient voltage suppressor (TVS) in a FlatPAK 5×6-mm package. The dual-chip device, which features an oxide planar chip junction and common cathode circuit configuration, saves PCB space and simplifies layouts.
With its 3-A, 600-V standard rectifier, 200-W TRANSZORB TVS, and wide operating temperature range of -55°C to +175°C, the R3T2FPHM3 is well-suited for automotive applications. It can be used for secondary protection of sensor units, distributed airbag modules, and low-power DC/DC converters in power distribution systems. When paired in series with a standard TVS, the R3T2FPHM3 offers a complete >24-V solution with a low clamping ratio.
The R3T2FPHM3’s rectifier has a low forward voltage drop of 0.86 V to reduce power losses and improve efficiency. Its TVS offers a breakdown voltage of 27 V. ESD protection complies with IEC 61000-4-2 standards in both air discharge mode and contact mode. The part also provides a moisture sensitivity level (MSL) of 1 per J-STD-020 requirements and a UL 94 V-0 flammability rating. Additionally, Vishay offers the device in AEC-Q101 qualified versions.
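To put the 0.86-V forward drop in perspective, a quick back-of-the-envelope estimate of rectifier conduction loss at the device's 3-A rating can be sketched as follows. This is a rough approximation only: it assumes a constant Vf (ignoring its dependence on current and junction temperature) and neglects reverse-recovery and leakage losses, so it is not a substitute for the datasheet's thermal calculations.

```python
# Rough conduction-loss estimate for a standard rectifier diode,
# using the R3T2FPHM3's published figures (Vf = 0.86 V, If = 3 A).
# Simplifying assumption: Vf is treated as constant over current
# and temperature, so P is simply Vf * I_avg.

def conduction_loss(vf_volts: float, i_avg_amps: float) -> float:
    """Average conduction loss P = Vf * I for a forward-biased diode."""
    return vf_volts * i_avg_amps

if __name__ == "__main__":
    p = conduction_loss(0.86, 3.0)
    print(f"Conduction loss at full rated current: {p:.2f} W")  # ~2.6 W
```

At full rated current the diode dissipates roughly 2.6 W, which illustrates why even a modest reduction in forward drop matters in a 5 x 6-mm surface-mount package with limited heat-spreading area.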
Samples and production quantities of the R3T2FPHM3 are available now, with lead times of 12 weeks.
The post Automotive SMD packs rectifier and TVS appeared first on EDN.
Security IP portfolio is tailored to FPGAs

A suite of security IP products from Rambus brings cryptographic, side-channel, and quantum-safe protection to the FPGA market. Rambus says the security IP is designed to meet the unique needs of a broad range of FPGAs, from high-performance accelerators for generative AI to low-power, lightweight IoT devices.
The FPGA-targeted security IP products include root of trust, 800G MACsec, IPsec, and both classic and quantum-safe public key encryption solutions. Leveraging differential power analysis (DPA) and fault injection attack (FIA) countermeasures, the security IP delivers high levels of protection against both cryptographic and side-channel attacks. With new quantum-safe security IP products, Rambus aims to future-proof the protection of FPGAs for the coming post-quantum cryptography (PQC) era.
“As customer demand for security continues to accelerate, Rambus is dedicated to providing state-of-the-art security IP for the broad range of applications increasingly enabled by FPGAs,” said Neeraj Paliwal, general manager of Silicon IP at Rambus. “Our security IP solutions safeguard these FPGA devices now and in the future with quantum safe protection from PQC attacks.”
Explore the extensive range of hardware security IP products offered by Rambus by using the product page link below.
The post Security IP portfolio is tailored to FPGAs appeared first on EDN.