Feed aggregator

Amazon Sidewalk: The first STM32-qualified devices are already making a difference. Check out this customer testimonial!

ELE Times - Thu, 01/02/2025 - 12:52

For the first time, Nucleo boards housing an STM32WBA5 and an STM32WLx5 have received Amazon Sidewalk certification, guaranteeing that these STM32 MCUs offer robust integration, high efficiency, and trusted security when deployed on an Amazon Sidewalk network. We also show how Subeca, an end-to-end water management platform in the United States, leveraged these STM32 devices to obtain its own Amazon Sidewalk qualification, ensuring its customers can benefit from this vast and secure network to create a cost-effective and scalable solution for water metering and pressure management IoT systems.

What is Amazon Sidewalk?

The idea behind Amazon Sidewalk is elegantly simple: Internet-connected devices like Amazon Echos or some Ring Floodlight and Spotlight Cams serve as Amazon Sidewalk Bridges, creating a low-bandwidth, low-power wireless network by piggybacking on a tiny amount of the Bridges’ bandwidth (80 kbps). An Amazon Sidewalk device can thus connect to a Sidewalk Bridge over Bluetooth, securely joining the network and gaining access to the Internet. Moreover, once an Amazon Sidewalk end device is provisioned to the network via Bluetooth LE, it can rely on the long-range connectivity of the STM32WL5 to extend the network coverage over vast distances.

Amazon Sidewalk is free to use and simplifies operations. If a Sidewalk Bridge loses its Wi-Fi connection, Amazon’s technology can initiate a reconnection to the router without the user’s intervention. Bandwidth is also very low, and data usage is minimal and capped at 500 MB a month, meaning that even customers with a constrained Internet connection won’t feel its impact. Moreover, Amazon provides numerous encryption and security mechanisms to keep data private and safe. Hence, it’s possible to use Amazon Sidewalk for logistics, personal, or pet tracking, beyond-the-fence asset monitoring, smart irrigation systems, healthcare monitoring, or, as Subeca demonstrates, for more demanding applications like utilities monitoring on a national scale, as the Sidewalk coverage map suggests.

Amazon Sidewalk in one image

What the Amazon Sidewalk qualification means for STM32 developers

The hardware

As of today, boards featuring the STM32WBA5, STM32WL5, and STM32WLE5 have received the Amazon Sidewalk qualification. The STM32WBA5 offers a Cortex-M33, a Bluetooth LE 5.4 transceiver, and can target a SESIP Level 3 certification, while the STM32WLx5 devices use a Cortex-M4 and a sub-GHz radio. Engineers might choose an STM32WBA55 and an STM32WLE5 to optimize memory usage or an STM32WBA55 and an STM32WL55 for the greater flexibility this configuration affords.

Concretely, the STM32WBA5 talks directly to the Amazon Sidewalk Bridge using a Bluetooth LE connection. And in some instances, that’s all the system needs. However, when networking multiple end nodes over large distances, like in the case of Subeca, it’s necessary to use the STM32WL5 to talk to devices using CSS (Chirp Spread Spectrum, such as LoRa) or an FSK modulation, depending on the distance and frequency range engineers wish to target.

Amazon Sidewalk qualified hardware configurations

The software

An STM32WBA55 development board

To help developers jumpstart their projects, ST is offering software packages that implement a network stack that interacts easily with Amazon Sidewalk. This dramatically simplifies the connection to the network, the integration of security features into the application, and the onboarding process. Put simply, while the Amazon Sidewalk qualification guarantees that ST devices will provide the reliability and safety required, it is also a testament to our partnership with Amazon and our desire to help engineers take advantage of this technology.

Real-world applications

The qualification and partnership between Amazon and ST means that partners like Subeca can focus on showcasing their expertise and distinguishing their products from the competition instead of spending resources solving networking challenges. As Patrick Keaney, CEO of Subeca, explained,

“Our focus is on innovating and simplifying solutions that solve real-world challenges in the water market. We believe technology like advanced metering, leak detection, and pressure monitoring should be available to all water utilities everywhere, regardless of size. That means wireless connectivity is a must. ST’s STM32WBA5 and STM32WL5/STM32WLE5 wireless microcontrollers enabled us to bring our first Amazon sidewalk-qualified products to the market with great architectural flexibility, performance, low-power consumption in a cost-effective manner with meaningful device longevity and robust and resilient supply chain. Leveraging ST’s expansive device portfolio and ecosystem coupled with great technical support, ST offered us quality technical ingredients, ease-of-use, and portability required to transform our vision into reality.”

A NUCLEO-WL55JC1

Avnet also showcased an Amazon Sidewalk demo at AWS re:Invent 2024 featuring an STM32WBA5, an STM32WL55, and Avnet’s IoTConnect platform to handle the onboarding, device management, and data integration with AWS. Avnet’s solution is often a darling at ST Technology Tours because it vastly simplifies the creation of IoT systems by handling some of the most complex development operations. Put simply, the demo is one of the best examples of how ST, Amazon Sidewalk, and a member of the ST Partner Program can come together to make a difference in the operations of a company trying to take part in the IoT revolution.

Why it matters

Interconnecting a myriad of small devices to each other and the Internet has always been the IoT dream. The challenge is that building a new infrastructure from scratch is expensive, and without massive adoption, it will never reach critical mass. Amazon Sidewalk solves this issue by utilizing existing Echo devices and other Bridges connected to a router. By simply leveraging existing installations, the network is already in place. And by enabling product makers and customers to use it for free, it significantly lowers the barrier to entry.

Additionally, Amazon Sidewalk handles a lot of the complexities associated with such a network, from security to over-the-air updates, which also explains why Amazon instituted a qualification program: to protect all participants in this ecosystem, Amazon only authorizes trusted devices to connect to its network. By qualifying STM32 microcontrollers, Amazon ensures that its partners use devices that will run the network stack reliably and implement security features according to strict standards.

The post Amazon Sidewalk: The first STM32-qualified devices are already making a difference. Check out this customer testimonial! appeared first on ELE Times.

PWM power DAC incorporates an LM317

EDN Network - Wed, 01/01/2025 - 16:57

Instead of the conventional approach of backing up a DAC with an amplifier to boost output, this design idea charts a less-traveled path to power. It integrates an LM317 positive regulator with a simple 8-bit PWM DAC topology to obtain a robust 11-V, 1.5-A capability. It thus preserves simplicity while exploiting the built-in fault protection features (thermal and overload) of that time-proven Bob Pease masterpiece. Its output accuracy tracks the guaranteed 2% precision of the LM317’s internal voltage reference, making it securely independent of the vagaries of both the 5-V logic supply rail and the incoming raw DC supply.

Wow the engineering world with your unique design: Design Ideas Submission Guide

Figure 1 diagrams how it works.

Figure 1 LM317 regulator melds with HC4053 CMOS switch to make a 16-W PWM power DAC.

CMOS SPDT switches U1b and U1c accept a 10-kHz PWM signal to generate a 0 V to 9.75 V “ADJ” control signal for the U2 regulator via feedback network R1, R2, and R3. The incoming PWM signal is AC coupled so that U1 can “float” on U2’s output. U1c provides an inverse of the PWM signal, implementing active ripple cancellation as described in “Cancel PWM DAC ripple with analog subtraction.” Note that R1||R2 = R3 to optimize ripple subtraction and DAC accuracy.

This feedback arrangement does, however, make the output voltage a nonlinear function of PWM duty factor (DF) as given by:

Vout = 1.25 / (1 – DF*(1 – R1/(R1 + R2)))
= 1.25 / (1 – 0.885*DF)

This is graphed in Figure 2. 

Figure 2 The Vout (1.25 V to 11 V) versus PWM DF (0 to 1) where Vout = 1.25 / (1 – 0.885*DF).

Figure 3 plots the inverse of Figure 2, yielding the PWM DF required for any given Vout.

Figure 3 The inverse of Figure 2 where PWM DF = (1 – 1.25/Vout)/0.885.

The corresponding 8-bit PWM setting works out to: Dbyte = 255 (1 – 1.25 / Vout) / 0.885
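
If the PWM is generated by a microcontroller, these relationships are easy to fold into a small helper routine. The following is a minimal Python sketch (mine, not part of the original design idea) that applies the equations above, using the 1.25-V reference and the 0.885 factor of the 11-V full-scale design shown:

VREF = 1.25                      # LM317 reference voltage, volts
K = 0.885                        # 1 - R1/(R1 + R2) for the 11-V full-scale design

def pwm_byte(vout):
    """Return the 8-bit PWM setting (0-255) for a desired Vout in volts."""
    if not (VREF <= vout <= VREF / (1 - K)):   # valid range: 1.25 V to about 10.9 V
        raise ValueError("Vout outside this divider's range")
    return round(255 * (1 - VREF / vout) / K)

def vout_from_byte(dbyte):
    """Return the predicted output voltage for a given PWM byte (0-255)."""
    return VREF / (1 - K * dbyte / 255)

# Example: pwm_byte(5.0) returns 216; vout_from_byte(216) returns roughly 4.99 V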

Vfullscale = 1.25 / (R1/(R1 + R2)), so design choices other than 11 V are available. 11 V is the maximum consistent with HC4053’s ratings, but up to 20 V is feasible if the metal gate CD4053B is substituted for U1. Don’t forget, however, the requirement that R3 = R1||R2.

The supply rail V+ can be anything from a minimum of Vfullscale+3V to accommodate U2’s minimum headroom dropout requirement, up to the LM317’s absmax 40-V limit. DAC accuracy will be unaffected due to this chip’s excellent PSRR, although of course efficiency may suffer.

U2 should be heatsunk as dictated by its dissipation, which is the required output current multiplied by the V+ to Vout differential. Up to double-digit watts is possible at high currents and low Vout.

Stephen Woodward’s relationship with EDN’s DI column goes back quite a long way. Over 100 submissions have been accepted since his first contribution back in 1974.

Related Content

The post PWM power DAC incorporates an LM317 appeared first on EDN.

2024: A year’s worth of interconnected themes galore

EDN Network - Wed, 01/01/2025 - 16:54

As any of you who’ve already seen my precursor “2025 Look Ahead” piece may remember, we’ve intentionally flipped the ordering of my two end-of-year writeups once again this year. This time, I’ll be looking back over 2024: for historical perspective, here are my prior retrospectives for 2019, 2021, 2022 and 2023 (we skipped 2020).

As I’ve done in past years, I thought I’d start by scoring the topics I wrote about a year ago in forecasting the year to come:

  • Increasingly unpredictable geopolitical tensions
  • The 2024 United States election
  • Windows (and Linux) on Arm
  • Declining smartphone demand, and
  • Internal and external interface evolutions

Maybe I’m just biased, but I think I nailed ‘em all, albeit with varying degrees of impact. To clarify, by the way, it’s not that whether the second one would happen was difficult to predict; the outcome, which I discussed a month back, is what was unclear at the time. In the sections that follow, I’m going to elaborate on one of the above themes, as well as discuss other topics that didn’t make my year-ago forecast but ended up being particularly notable (IMHO, of course).

Battery transformations

I’ve admittedly written quite a lot about lithium-based batteries and the devices they fuel over the past year, as I suspect I’ll also be doing in the year(s) to come. Why? My introductory sentence to a recent teardown of a “vape” device answers that question, I think:

The ever-increasing prevalence of lithium-based batteries in various shapes, sizes and capacities is creating a so-called “virtuous circle”, leading to lower unit costs and higher unit volumes which encourage increasing usage (both in brand new applications and existing ones, the latter as a replacement for precursor battery technologies), translating into even lower unit costs and higher unit volumes that…round and round it goes.

Call me simple-minded (as some of you already may have done a time or few over the years!) but I consistently consult the same list of characteristics and tradeoffs among them when evaluating various battery technologies…a list that was admittedly around half its eventual length when I first scribbled it on a piece of scrap paper a few days ago, until I kept thinking of more things to add in the process of keyboard-transcribing it (thereby eventually encouraging me to delete the “concise” adjective I’d originally used to describe it)!

  • Volume manufacturing availability, translating to cost (as I allude to in the earlier quote)
  • Form factor implementation flexibility (or not)
  • The required dimensions and weight for a given amount of charge-storage capacity
  • Both peak and sustained power output
  • The environmental impacts of raw materials procurement, battery manufacturing, and eventual disposal (or, ideally, recycling)
  • Speaking of “environmental”, the usable operating temperature range, along with tolerance to other environment variables such as humidity, shock and vibration
  • And recharge speed (both to “100% full” and to application-meaningful percentages of that total), along with the number of recharge cycles the battery can endure until it no longer can hold enough anode electrons to be application-usable in a practical sense.

Although plenty of lithium battery-based laptops, smartphones and the like are sold today, a notable “driver” of incremental usage growth in the first half of this decade (and beyond) has been various mobility systems—battery-powered drones (and, likely in the future, eVTOLs), automobiles and other vehicles, untethered robots, and watercraft (several examples of which I’ll further elaborate on later in this writeup, for a different reason). Here, the design challenges are quite interconnected and otherwise complex, as I discussed back in October 2021:

Li-ion battery technology is pretty mature at this point, as is electric motor technology, so in the absence of a fundamental high-volume technology breakthrough in the future, to get longer flight time, you need to include bigger batteries…which leads to what I find most fundamentally fascinating about drones and their flying kin: the fundamental balancing act of trading off various contending design factors that is unique to the craft of engineering (versus, for example, pure R&D or science). Look at what I’ve just said. Everyone wants to be able to fly their drone as long as possible, before needing to land and swap out battery packs. But in order to do so, that means that the drone manufacturer needs to include larger battery cells, and more of them.

Added bulk admittedly has the side benefit of making the drone more tolerant of wind gusts, for example, but fundamentally, the heavier the drone the beefier the motors need to be in order to lift it off the ground and fly it for meaningful altitudes, distances, and durations. Beefier motors burn more juice, which begs for more batteries, which make the drone even heavier…see the quagmire? And unlike with earth-tethered electricity-powered devices, you can’t just “pull over to the side of the road” if the batteries die on you.

Now toss in the added “twist” that everyone also wants their drone to be as intelligent as possible so it doesn’t end up lost or tangled in branches, and so it can automatically follow whatever’s being videoed. All those image and other sensors, along with the intelligence (and memory, and..) to process the data coming off them, burns juice, too. And don’t forget about the wireless connectivity between the drone and the user—minimally used for remote control and analytics feedback to the user…How do you balance all of those contending factors to come up with an optimum implementation for your target market?

Although the previous excerpt was specifically about drones, many of the points I raised are also relevant at least to a degree in the other mobility applications I mentioned. That said, an electric car’s powerplant size and weight constraints aren’t quite as acute as an airborne system’s might be, for example. This application-defined characteristics variability, both in an absolute sense and relative to others on my earlier list, helps explain why, as Wikipedia points out, “there are at least 12 different chemistries of Li-ion batteries” (with more to come). To wit, developers are testing out a diversity of both anode and cathode materials (and combinations of them), increasingly aided by AI (which I’ll also talk more about later in this piece) in the process, along with striving to migrate away from “wet” electrolytes, which among other things are flammable and prone to leakage, toward safer solid-state approaches.

Another emerging volume growth application, as I highlighted throughout the year, is battery generators, most frequently showcased by me in their compact portable variants. Here, while form factor and weight remain important, since the devices need to be hauled around by their owners, they’re stationary while in use. Extrapolate further and you end up with even larger home battery-backup banks that never get moved once installed. And extrapolate even further, to a significant degree in fact, and you’re now talking about backup power units for hospitals, for example, or even electrical grid storage for entire communities or regions. One compelling use case is to smooth out the inherent availability variability of renewable energy sources such as solar and wind, among other reasons to “feed” the seemingly insatiable appetites of AI workload-processing data centers in a “green”-as-possible manner. And in all these stationary-backup scenarios, installation space is comparatively abundant and weight is also of lesser concern; the primary selection criteria are factors such as cost, invulnerability, and longevity.

As such, non-lithium-based technologies will likely become increasingly prominent in the years to come. Sodium-ion batteries (courtesy of, in part, sodium’s familial proximity to lithium in the Periodic Table of Elements) are particularly near-term promising; you can already buy them on Amazon! The first US-based sodium-ion “gigafactory” was recently announced, as was the US Department of Energy’s planned $3 billion in funding for new sodium-ion (and other) battery R&D projects. Iron-based batteries such as the mysteriously named (but not so mysterious once you learn how they work) iron-air technology tout raw materials abundance (how often do you come across rust, after all?) translating into low cost. Vanadium-based “flow” batteries also hold notable promise. And there’s one other grid-scale energy storage candidate with an interesting twist: old EV batteries. They may no longer be sufficiently robust to reliably power a moving vehicle, but stationary backup systems still provide a resurrecting life-extension opportunity.

For ongoing information on this topic, in addition to my and colleagues’ periodic coverage, market research firm IDTechEx regularly publishes blog posts on various battery technology developments which I also commend to your inspection. I have no connection with the firm aside from being a contented consumer of their ongoing information output!

Drones as armaments

As a kid, I was intrigued by the history of warfare. Not (at all) the maiming, killing and other destruction aspects, mind you, instead the equipment and its underlying technologies, their use in conflicts, and their evolutions over time. Three related trends that I repeatedly noticed were:

  1. Technologies being introduced in one conflict and subsequently optimized (or in other cases disbanded) based on those initial experiences, with the “success stories” then achieving widespread use in subsequent conflicts
  2. The oft-profound advantages that adopters of new successful warfare technologies (and equipment and techniques based on them) gained over less-advanced adversaries who were still employing prior-generation approaches, and
  3. That new technology and equipment breakthroughs often rapidly obsoleted prior-generation warfare methods

Re point #1, off the top of my head, there’s (with upfront apologies for any United States centricity in the examples that follow):

  • Chemical warfare, considered (and briefly experimented with) during the US Civil War, with widespread adoption beginning in World War I (WWI)
  • Airplanes and tanks, introduced in WWI and extensively leveraged in WWII (and beyond)
  • Radar (airplanes), sonar (submarines) and other electronic surveillance, initially used in WWII with broader implementation in subsequent wars and other conflicts
  • And RF and other electronics-based communications methods, including cryptography (and cracking), once again initiated in WWII

And to closely related points #2 and #3, two WWII examples come to mind:

  • I still vividly recall reading as a kid about how the Polish army strove, armed with nothing but horse cavalry, to defend against invading German armored brigades, although the veracity of at least some aspects of this propaganda-tainted story are now in dispute.
  • And then there was France’s Maginot Line, a costly “line of concrete fortifications, obstacles and weapon installations built by France in the 1930s” ostensibly to deter post-WWI aggression by Germany. It was “impervious to most forms of attack” across the two countries’ shared border, but the Germans instead “invaded through the Low Countries in 1940, passing it to the north”. As Wikipedia further explains, “The line, which was supposed to be fully extended further towards the west to avoid such an occurrence, was finally scaled back in response to demands from Belgium. Indeed, Belgium feared it would be sacrificed in the event of another German invasion. The line has since become a metaphor for expensive efforts that offer a false sense of security.”

I repeatedly think of case studies like these as I read about how the Ukrainian armed forces are, both in the air and sea, now using innovative, often consumer electronics-sourced approaches to defend against invading Russia and its (initially, at least) legacy warfare techniques. Airborne drones (more generally: UAVs, or unmanned aerial vehicles) have been used for surveillance purposes since at least the Vietnam War as alternatives to satellites, balloons, manned aircraft and the like. And beginning with aircraft such as the mid-1990s Predator, UAVs were also able to carry and fire missiles and other munitions. But such platforms were not only large and costly, but also remotely controlled, not autonomous to any notable degree. And they weren’t in and of themselves weapons.

That’s all changed in Ukraine (and elsewhere, for that matter) in the modern era. In part hamstrung by its allies’ constraints on what missiles and other weapons it was given access to and how and where they could be used, Ukraine has broadened drones’ usage beyond surveillance into innate weaponry, loading them up with explosives and often flying them hundreds of miles for subsequent detonation, including all the way to Moscow. Initially, Ukraine retrofit consumer drones sourced from elsewhere, but it now manufactures its own UAVs in high volumes. Compared to their Predator precursors, they’re compact, lightweight, low cost and rugged. They’re increasingly autonomous, in part to counteract Russian jamming of wireless control signals coming from their remote operators. They can even act as flamethrowers. And as the image shown at the beginning of this section suggests, they not only fly but also float, a key factor in Ukraine’s to-date success both in preventing a Russian blockade of the Black Sea and in attacking Russia’s fleet based in Crimea.

AI (again, and again, and…)

AI has rapidly grown beyond its technology-coverage origins and into the daily clickbait headlines and chyrons of even mainstream media outlets. So it’s probably no surprise that this particular TLA (with “T” standing for “two” this time, versus the usual “three”) is a regular presence in both my end-of-year and next-year-forecast writeups, along with plenty of ongoing additional AI coverage in-between each year’s content endpoints. A month ago, for example, I strove to convince you that multimodal AI would be ascendant in the year(s) to come. Twelve months ago, I noted the increasing importance of multimodal models’ large language model (LLM) precursors over the prior year, and the month(-ish) before that, I’d forecasted that generative AI would be a big deal in 2023 and beyond. Lather, rinse and repeat.

What about the past twelve months; what are the highlights? I could easily “write a book” on just this topic (as I admittedly almost already did earlier re “Battery Transformations”). But with the 3,000-word count threshold looming, and always mindful of Aalyia’s wrath (I kid…maybe…), I’ll strive to practice restraint in what follows. I’m not, for example, going to dwell on OpenAI’s start-of-year management chaos and ongoing key-employee-shedding, nor on copyright-infringement lawsuits brought against it and its competitors by various content-rights owners…or for that matter, on lawsuits brought against it and its competitors (and partners) by other competitors. Instead, here’s some of what else caught my eye over the past year:

  • Deep learning models are becoming more bloated with the passage of time, despite floating point-to-integer conversion, quantization, sparsity and other techniques for trimming their size. Among other issues, this makes it increasingly infeasible to run them natively (and solely) on edge devices such as smartphones, security cameras and (yikes!) autonomous vehicles. Imagine (a theoretical case study, mind you) being unable to avoid a collision because your car’s deep learning model is too dinky to cover all possible edge and corner cases and a cloud-housed supplement couldn’t respond in time due to server processing and network latency-and-bandwidth induced delays…
  • As the models themselves grow, the amount of processing horsepower (not to mention consumed power) and time needed to train them increases as well…exponentially so.
  • Resource demands for deep learning inference are also skyrocketing, especially as the trained models referenced become more multimodal and otherwise complex, not to mention the new data the inference process is tasked with analyzing.
  • And semiconductor supplier NVIDIA today remains the primary source of processing silicon for training, along with (to a lesser but still notable market segment share degree) inference. To the company’s credit, decades after kicking off its advocacy of general-purpose graphics processing (GPGPU) applications, its longstanding time, money and headcount investments have borne big-time fruit for the company. That said, competitors (encouraged by customers aspiring for favorable multi-source availability and pricing outcomes) continue their pursuit of the “Green Team”.
  • To my earlier “consumed power” comments, along with my even earlier “seemingly insatiable appetites of AI workload-processing data centers” comments, and as my colleague (and former boss) Bill Schweber also recently noted, “AI-driven datacenter energy demand could expand 160 percent over the next two years, leaving 40 percent of existing facilities operationally constrained by power availability,” to quote recent coverage in The Register. In response to this looming and troubling situation, in the last few days alone I’ve come across news regarding Amazon (“Amazon AI Data Centers To Double as Carbon Capture Machines”) and Meta (“Meta wants to use nuclear power for its data centers”). Plenty of other recent examples exist. But will they arrive in time? And will they only accelerate today’s already worrying global warming pace in the process?
  • But, in spite of all of this spiraling “heavy lifting”, researchers continue to conclude that AI still doesn’t have a coherent understanding of the world, not to mention that the ROI on ongoing investments in what AI can do may be starting to level off (at least to some observers, albeit not a universally held opinion).
  • One final opinion; deep learning models are seemingly already becoming commodities, a trend aided in part by increasingly capable “open” options (although just what “open” means has no shortage of associated controversy). If I’m someone like Amazon, Apple, Google, Meta or Microsoft, whose deep learning investments reap returns in associated AI-based services and whose models are “buried” within these services, this trend isn’t so problematic. Conversely, however, for someone whose core business is in developing and licensing models to others, the long-term prognosis may be less optimistic, no matter how rosy (albeit unprofitably so) things may currently seem to be. Heck, even AMD and NVIDIA are releasing open model suites of their own nowadays…

Auld Lang Syne

I’m writing this in early December 2024. You’ll presumably be reading it sometime in January 2025. I’ll split the difference and wrap up by first wishing you all a Happy New Year! 😉

As usual, I originally planned to cover a number of additional topics in this piece, such as (in no particular order save for how they came out of my noggin):

But (also) as usual I ended up with more things that I wanted to write about than I had a reasonable wordcount budget to do so. Having now passed through 3,000 words, I’m going to restrain myself and wrap up, saving the additional topics (as well as updates on the ones I’ve explored here) for dedicated blog posts to come in the coming year(s). Let me know your thoughts on my top-topic selections, as well as what your list would have looked like, in the comments!

Brian Dipert is the Editor-in-Chief of the Edge AI and Vision Alliance, and a Senior Analyst at BDTI and Editor-in-Chief of InsideDSP, the company’s online newsletter.

Related Content

The post 2024: A year’s worth of interconnected themes galore appeared first on EDN.

Hack - converted a passive 40w sub woofer into a powered Bluetooth & Aux, thrift store finds

Reddit:Electronics - Tue, 12/31/2024 - 21:31

Found at a thrift store: a passive subwoofer for $2 and a small powered Bluetooth & Aux speaker system that was “broken”, all for $10

Crab Rave sounds real good too !!!

submitted by /u/SirGreybush

Ternary gain-switching 101 (or 10202, in base 3)

EDN Network - Tue, 12/31/2024 - 16:54

This design idea is centered on the humble on/off/on toggle switch, which is great for selecting something/nothing/something else, but can be frustrating when three active options are needed. One possibility is to use the contacts to connect extra, parallel resistors across a permanent one (for example), but the effect is something like low/high/medium, which just looks wrong.

That word “active” is the clue to making the otherwise idle center position do some proper work, like helping to control an op-amp stage’s gain, as shown in Figure 1.

Figure 1 An on/off/on switch gives three gain settings in a non-inverting amplifier stage and does so in a rational order.

Wow the engineering world with your unique design: Design Ideas Submission Guide

I’ve used this principle many times, but can’t recall having seen it in published circuits, and think it’s novel, though it may be so commonplace as to be invisible. It’s certainly obvious when you think about it.

A practical application

That’s the basic idea, but it’s always more satisfying to convert such ideas into something useful. Figure 2 illustrates just that: an audio gain-box whose amplification is switched in a ternary sequence to give precise 1-dB steps from 0 to +26 dB. As built, it makes a useful bit of lab kit.

Figure 2 Ternary switching over three stages gives 0–26 dB gain in precise 1-dB steps.

Three gain stages are concatenated, each having its own switch. C1 and C2 isolate any DC, and R1 and R12 are “anti-click” resistors, ensuring that there’s no stray voltage on the input or output when something gets plugged in. A1d is the usual rail-splitter, allowing use on a single, isolated supply.
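
As a quick sanity check on the ternary arithmetic, the short Python sketch below (mine, not the author’s) enumerates all 27 switch combinations. It assumes the three stages are weighted 1, 3, and 9 dB, with each on/off/on switch contributing 0×, 1×, or 2× its stage weight; under that assumption every integer gain from 0 to +26 dB appears exactly once.

from itertools import product

STAGE_WEIGHTS_DB = (1, 3, 9)     # assumed per-stage weighting

def total_gain_db(digits):
    """digits: one ternary digit (0, 1, or 2) per stage, least-significant first."""
    return sum(d * w for d, w in zip(digits, STAGE_WEIGHTS_DB))

gains = sorted(total_gain_db(s) for s in product((0, 1, 2), repeat=3))
assert gains == list(range(27))  # 0 to +26 dB in exact 1-dB steps, each once

# To set a given gain, write it in base 3: e.g., 17 dB = 2x1 dB + 2x3 dB + 1x9 dB.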

The op-amps are shown as common-or-garden TL074/084s. For lower noise and distortion, (a pair of) LM4562s would be better, though they take a lot more current. With a 5-V supply, the MCP6024 is a good choice. For stereo use, just duplicate almost everything and use double-pole switches.

All resistor values are E12/24 for convenience. The resistor combinations shown are much closer to the ideal, calculated values than the assumed 1% tolerance of actual parts, and give a better match than E96s would in the same positions.

Other variations on the theme

The circuit of Figure 2 could also be built for DC use but would then need low-offset op-amps, especially in the last stage. (Omit C1, C2, and other I/O oddments, obviously.)

Figure 1 showed the non-inverting version, and Figure 3 now employs the idea in an inverting configuration. Beware of noise pick-up at the virtual-earth point, the op-amp’s inverting input.

Figure 3 An inverting amplifier stage using the same switching principle.

The same scheme can also be used to make an attenuator, and a basic stage is sketched in Figure 4. Its input resistance changes depending on the switch setting, so an input buffer is probably necessary; buffering between stages and of the output certainly is.

Figure 4 A single attenuation stage with three switchable levels.

Conclusion: back to binary basics

You’ve probably been wondering, “What’s wrong with binary switching?” Not a lot, except that it uses more op-amps and more switches while being rather obvious and hence less fun.

Anyway, here (Figure 5) is a good basic circuit to do just that.

Figure 5 Binary switching of gain from 0 to +31 dB, using power-of-2 steps. Again, the theoretical resistor values are much closer to the ideal than their actual 1% tolerances.

Nick Cornford built his first crystal set at 10, and since then has designed professional audio equipment, many datacomm products, and technical security kit. He has at last retired. Mostly. Sort of.

Related Content

The post Ternary gain-switching 101 (or 10202, in base 3) appeared first on EDN.

Wave Soldering Definition, Process, Working, Uses & Advantages

ELE Times - Tue, 12/31/2024 - 14:09

Wave soldering is a highly effective method used in the electronics manufacturing industry to solder components onto printed circuit boards (PCBs). This process involves conveying the PCB over a wave of molten solder, enabling the formation of robust and dependable solder connections in an efficient manner. Below, we explore the intricacies of wave soldering, its process, applications, advantages, and disadvantages.

What is Wave Soldering?

Wave soldering is a bulk soldering process primarily used for soldering through-hole and some surface-mount components on PCBs. It is named after the “wave” of molten solder that contacts the board’s underside. This process ensures that all solder joints are formed simultaneously, making it ideal for high-volume production environments.

The technique is well-suited for double-sided PCBs where components are mounted on both sides, although it is primarily used for through-hole components. Wave soldering is a preferred choice for its speed, consistency, and ability to handle complex boards.

How Wave Soldering Works

The wave soldering process consists of several key stages:

  1. Fluxing:
    • Flux is applied to the PCB to clean and prepare the surfaces for soldering. Flux removes oxidation from the component leads and pads, ensuring proper adhesion of solder.
    • It also protects the components and pads from oxidation during the soldering process.
  2. Preheating:
    • The PCB is preheated to prevent thermal shock and to activate the flux. This step ensures that the board and components reach a suitable temperature for soldering.
    • Preheating also reduces the risk of warping and ensures consistent solder flow.
  3. Solder Wave Contact:
    • The preheated PCB is transported across a controlled wave of molten solder, which is continuously generated by a pump within the soldering equipment.
    • The wave ensures solder adheres to the exposed metal surfaces of the PCB, forming solder joints for all components simultaneously.
  4. Cooling:
    • After soldering, the PCB undergoes a cooling phase to allow the solder joints to solidify, ensuring that the components are firmly and securely affixed to the board.
    • Proper cooling minimizes defects like solder cracks.

Wave Soldering Process

  1. Preparation:
    • Verify that the PCB is thoroughly cleaned and devoid of any contaminants.
    • Place all components precisely in their designated positions on the PCB.
  2. Machine Setup:
    • Configure the soldering machine, including temperature settings, conveyor speed, and wave height.
    • Confirm that the solder pot contains the correct solder alloy suitable for the specific application.
  3. Flux Application:
    • Apply flux evenly across the PCB. Automated fluxing systems are often used for consistency.
  4. Preheating:
    • Pass the PCB through a preheating zone to gradually raise its temperature.
  5. Soldering:
    • The PCB moves across the solder wave, where molten solder bonds with the exposed metal areas, forming strong connections for all components.
  6. Post-Soldering Inspection:
    • Optical systems designed for automated inspections are frequently employed to identify defects such as solder bridges, cold joints, or voids.
  7. Cleaning (Optional):
    • Remove any residual flux if necessary, using cleaning agents or specialized equipment.

Uses & Applications of Wave Soldering

Wave soldering is extensively used in various industries for high-volume PCB production. Common applications include:

  • Consumer Electronics:
    • Manufacturing devices like televisions, radios, and home appliances.
  • Automotive Electronics:
    • Producing PCBs for car dashboards, sensors, and control units.
  • Telecommunications:
    • Creating PCBs for routers, switches, and telecommunication equipment.
  • Industrial Electronics:
    • Manufacturing control systems, power supplies, and industrial automation equipment.
  • Medical Devices:
    • Soldering PCBs for medical monitoring devices, imaging equipment, and diagnostic tools.

Advantages of Wave Soldering

  1. High Efficiency:
    • Wave soldering is ideal for mass production due to its speed and ability to solder multiple joints simultaneously.
  2. Consistency:
    • The process ensures uniform solder joints, reducing variability and defects.
  3. Cost-Effective:
    • Mass soldering significantly lowers labor expenses and minimizes material wastage.
  4. Compatibility with Through-Hole Components:
    • Wave soldering excels at soldering through-hole components, which are difficult to solder using other techniques.
  5. Automation-Friendly:
    • The process can be fully automated, minimizing manual intervention and increasing productivity.

Disadvantages of Wave Soldering

  1. Limited Surface-Mount Compatibility:
    • While wave soldering can handle some surface-mount devices (SMDs), it is less effective for densely populated PCBs designed for reflow soldering.
  2. Defects:
    • Issues such as solder bridging, voids, and insufficient solder can occur if the process parameters are not well-controlled.
  3. Thermal Stress:
    • Components and PCBs can be damaged by excessive heat if preheating and soldering temperatures are not optimized.
  4. Environmental Concerns:
    • Lead-based solders and flux chemicals used in wave soldering can pose environmental and health risks. Lead-free alternatives mitigate these issues but may require higher temperatures.
  5. Complex Setup:
    • Setting up and maintaining wave soldering machines can be complex and requires skilled operators.

Conclusion

Wave soldering remains a critical process in PCB manufacturing, especially for through-hole technology. Its ability to solder large volumes of components quickly and consistently makes it indispensable in industries requiring mass production. However, with the increasing prevalence of surface-mount technology and miniaturized PCBs, alternative methods like reflow soldering are becoming more prominent.

Understanding the wave soldering process, its advantages, and limitations enables manufacturers to optimize their production processes and achieve high-quality results. As the industry continues to evolve, wave soldering will remain a valuable tool in the electronics manufacturing arsenal.

The post Wave Soldering Definition, Process, Working, Uses & Advantages appeared first on ELE Times.

IntelligentElectronicComponentSearch

Reddit:Electronics - Tue, 12/31/2024 - 14:03

Hi, I’ve uploaded a tool I created to GitHub for searching electronic component prices across some of the major marketplaces (currently Mouser, DigiKey, Farnell, and TME).

It’s a Python program with a pretty simple interface that uses the APIs of these distributors to fetch prices for any components you search. Essentially, you enter the part number and the desired quantity, and the tool queries the APIs of different stores to give you real-time info on pricing, stock, and availability.

In addition, it has a feature that generates an optimized shopping list in .xlsx format, helping you figure out which store to buy each component from to minimize costs. You can configure which markets to query and input your own API keys for each one.
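
To illustrate what that optimization step boils down to, here is a rough Python sketch (not taken from the repository) of the per-part selection logic, with the distributor API calls abstracted away behind hypothetical offer data:

def cheapest_source(offers, qty):
    """offers: list of dicts like {"store": str, "unit_price": float, "stock": int}.
    Return the lowest-cost offer that can cover qty, or None if nothing can."""
    viable = [o for o in offers if o["stock"] >= qty]
    return min(viable, key=lambda o: o["unit_price"] * qty) if viable else None

# Hypothetical example data for one part number; the real tool fills this in
# from the distributors' official APIs.
offers = [
    {"store": "Mouser",  "unit_price": 0.42, "stock": 1200},
    {"store": "DigiKey", "unit_price": 0.39, "stock": 80},
    {"store": "TME",     "unit_price": 0.45, "stock": 5000},
]
print(cheapest_source(offers, 100))   # DigiKey lacks stock here, so Mouser wins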

I know there are some alternatives out there, but I find it really convenient to have something like this for myself, and the best part is that it relies on the official APIs of each distributor. Plus, since it’s all local, you have full control over your data.

I’d love to hear your thoughts on it, and if you think there’s something I could improve or add. Any feedback is welcome!

Thanks, and I hope it’s helpful! 🙌

https://github.com/MarvinTechLab/IntelligentElectronicComponentSearch

submitted by /u/M4rv1n_09_

Netherlands launches ChipNL Competence Centre to drive semiconductor innovation

Semiconductor today - Tue, 12/31/2024 - 12:53
The ChipNL Competence Centre has been launched, aiming to advance technological capabilities and foster innovation in the Dutch semiconductor ecosystem. The European Commission and the Rijksdienst voor Ondernemend Nederland (RVO) have jointly allocated a budget of €12m for the next four years. Specifically, the four-year project aims to stimulate innovation, collaboration and talent development and to enhance global competitiveness of the Netherlands and Europe in semiconductor technologies...

A Bluetooth receiver, an identity deceiver

EDN Network - Mon, 12/30/2024 - 18:06

In mid-October 2015, EDN ran my teardown of Logitech’s Bluetooth Audio Adapter (a receiver, to be precise) based on a CSR (now Qualcomm) BlueCore CSR8630 Single Chip Audio ROM.

The CSR module covers the bulk of the bottom half of the PCB topside, with most of the top half devoted to discretes and such for implementing the audio line-level output amp and the like:

A couple of weeks later, in a follow-up blog post, I mentioned (and briefly compared) a bunch of other Bluetooth adapters I’d come across. Some acted as both receivers and transmitters, for example, while others embedded batteries for portable usage. They implemented varying Bluetooth profiles and specification levels, and some even supported aptX and other optional audio codecs. Among them were three different Aukey models; here’s what I said about them:

I recently saw Aukey’s BR-C1 on sale for $12.99, for example (both black and white color scheme options are available), while the BR-C2 was recently selling for $1 less, and the even fuller-featured BT-C2 was recently special-priced at $24.99.

Logitech’s device is AC-powered via an included “wall wart” intermediary and therefore appropriate for adding Bluetooth input-source capabilities to an A/V receiver, as discussed in my teardown. Aukey’s products conversely contain built-in rechargeable batteries and are therefore primarily intended for mobile use, such as converting a conventional pair of headphones into wireless units. Recharging of the Aukey devices’ batteries occurs via an included micro-USB cable and not-included 5V USB-output power source.

All of the Aukey products can also act as hands-free adapters, by virtue of their built-in microphones. The BR-C1 and BR-C2’s analog audio connections are output-only, thereby classifying them as Bluetooth receivers; the more expensive BT-C2 is both a Bluetooth transmitter and receiver (albeit not both at the same time). But the Bluetooth link between all of them and a wirelessly tethered device is bi-directional, enabling not only speakerphone integration with a vehicle audio subsystem or set of headphones (via analog outputs) but also two-way connectivity to a smartphone (via Bluetooth).

The fundamental difference between the BR-C1 and BR-C2, as far as I can tell, is the form factor; the BR-C1 is 2.17×2.17×0.67 inches in size, while the BR-C2 is 2×1×0.45 inches. All other specs, including play and standby time, seem to be identical. None of Aukey’s devices offer dual RCA jacks as an output option; they’re 3.5 mm TRS-only. However, as my teardown writeup notes, the inclusion of a TRS-to-dual RCA adapter cable in each product’s kit makes multiple integrated output options a seemingly unnecessary functional redundancy.

As time passed, my memory of the specifics of that latter piece admittedly faded, although I’ve re-quoted the following excerpt a few times in comparing a key point made then with other conceptually reminiscent product categories: LED light bulbs, LCDs, and USB-C-based devices:

Such diversity within what’s seemingly a mature and “vanilla” product category is what prompted me to put cyber-pen to cyber-paper for this particular post. The surprising variety I encountered even during my brief period of research is reflective of the creativity inherent to you, the engineers who design these and countless other products. Kudos to you all!

Fast forward to early December 2023, when I saw an Aukey Bluetooth audio adapter intended specifically for in-vehicle use (therefore battery powered, and with an embedded microphone for hands-free telephony), although usable elsewhere too. It was advertised at bargains site SideDeal (a sibling site to same-company Meh, who I’ve also mentioned before) for $12.99.

No specific model number was documented on the promo page, only some features and specs:

Features

  • Wireless Audio Stream
    • The Bluetooth 5 receiver allows you to wirelessly stream audio from your Bluetooth enabled devices to your existing wired home or car stereo system, speakers, or headphones
  • Long Playtime
    • Built-in rechargeable battery supports 18 hours of continuous playback and 1000 hours of standby time
  • Dual Device Link
    • Connect two bluetooth devices simultaneously; free to enjoy music or answer phone call from either of the two paired devices
  • Easy Use
    • Navigate your music on the receiver with built-in controls which can also be used to manage hands-free calls or access voice assistant

 Specifications

  • Type: Receiver
  • Connectivity: 3.5mm
  • Bluetooth standard: Bluetooth v5.0
  • Color: Black
  • To fit: Audio Receivers
  • Ports: 3.5 mm Jack

I bit. I bought three, actually; one each for my and my wife’s vehicles, and a third for teardown purposes. When they arrived, I put the third boxed one on the shelf.

Fast forward nearly a year later, to the beginning of November 2024 (and a couple of weeks prior to when I’m writing these words now), when I pulled the box back off the shelf and prepared for dissection. I noticed the model number, BR-C1, stamped on the bottom of the box but didn’t think anything more of it until I remembered and re-read that blog post published almost exactly nine years earlier, which had mentioned the exact same device:

(I’ve saved you from the boring shots of the blank cardboard box sides)

Impressive product longevity, eh? Hold that thought. Let’s dive in:

The left half of the box contents comprises three cables: USB-A to micro-USB for recharging, plus 3.5 mm (aka, 1/8”) TRS to 3.5 mm, and 3.5 mm to dual RCA for audio output connections:

And a couple of pieces of documentation (a PDF of the user manual is available here):

On the right, of course, is our patient (my images, this time, versus the earlier stock photos), as usual accompanied by a 0.75″ (19.1 mm) diameter U.S. penny for size comparison purposes:

The other three device sides, like the earlier box sides, are bland, so I’ve not included images of them. You’re welcome.

Note, among other things, the FCC ID, 2AFHP-BR-C1. Again, hold that thought. By the way, it’s 2AFHP-BR-C1, not the 2AFHPBR-C1 stamped on the underside, which as it turns out is a different device, albeit, judging from the photos, also an automobile interior-tailored product.

From past experience, I’ve learned that the underside of a rubber “foot” is often a fruitful path inside a device, so once again I rolled the dice:

Bingo: my luck continues to hold out!

With all four screws removed (or at least sufficiently loosened; due to all that lingering adhesive, I couldn’t get two of them completely out of the holes), the bottom popped right off:

And the first thing I saw staring back at me was the 3.7-V, 300 mAh Li-polymer “pouch” cell. Why they went with this battery form factor and formulation versus the more common Li-ion “can” is unclear; there was plenty of room in the design for the battery, and flexibility wasn’t necessary:

In pulling the PCB out of the remaining top half of the case:

revealing, among other things, the electret microphone above it:

I inadvertently turned the device on, wherein it immediately went into blue-blinking-LED standby mode (I fortuitously quick-snapped the first still photo while the LED was illuminated; the video below it shows the full blink cadence):

Why standby, versus the initial alternating red/blue pairing-ready sequence that per the user manual (not to mention common sense) it was supposed to first-time power up in? I suspect that since this was a refurbished (not brand new) device, it had been previously paired to something by the prior owner and the factory didn’t fully reset it before shipping it back out to me. A long-press of the topside button got the device into the desired Bluetooth pairing mode:

And another long-press powered the PCB completely off again:

The previously seen bottom side of the PCB was bare (the glued-on battery doesn’t count, in my book) and, as usual for low cost, low profit margin consumer electronics devices like this one, the PCB topside isn’t very component-rich, either. In the upper right is the 3.5 mm audio output jack; to its left, and in the upper left, is the micro-USB charging connector, with the solder sites for the microphone wiring harness between them. Below them is the system’s multi-function power/mode switch. At left is the three-wire battery connector. Slightly below and to its right (and near the center) is the main system processor, Realtek’s RTL8763BFR Bluetooth dual mode audio SoC with integrated DAC, ADC (for the already-seen mic), DSP and both ROM and RAM.

To the right of the Realtek RTL8763BFR is its companion 40 MHz oscillator, with a total of three multicolor LEDs in a column both above and below it. In contrast, you may have previously noted five light holes in the top of the device; the diffusion sticker in the earlier image of the inside of the top half of the chassis “bridges the gaps”. Below and to the left of the Realtek RTL8763BFR is the HT4832 audio power amplifier, which drives the aforementioned 3.5 mm audio output jack. The HT4832 comes from one of the most awesome-named companies I’ve yet come across: Jiaxing Heroic Electronic Technology. And at the bottom of the PCB, perhaps obviously, is the embedded Bluetooth antenna.

After putting the device back together, it seemingly still worked fine; here’s what the LEDs look like displaying the pairing cadence from the outside:

All in all, a seemingly straightforward teardown, right? So, then, what’s with the “Identity Deceiver” mention in this writeup’s title? Well, before finishing up, I as-usual hit up the FCC certification documentation, final-action dated January 29, 2018, to see if I’d overlooked anything notable…but the included photos showed a completely different device inside. This time, the bottom side of the PCB was covered with components. And one of them, the design’s area-dominant IC, was from ISSC Technologies, not Realtek. See for yourself.

Confused, I hit up Google to see if anyone else had done a teardown of the Aukey BR-C1. I found one, in video form, published on October 30, 2015. It shows the same design version as in the FCC documentation:

The Aukey BR-C1 product review from the same YouTube creator, published a week-plus earlier, is also worth a view, by the way:

Fortuitously, the YouTube “thumbnail” video for the first video showcases the previously mentioned ISSC Technologies chip:

It’s the IS1681S, a Bluetooth 3.0+EDR multimedia SOC. Here’s a datasheet. ISSC Technologies was acquired by Microchip Technology in mid-2014 and the IS1681S presumably was EOL’d sometime afterward, thereby prompting Aukey’s redesign around Realtek silicon. But how was Aukey able to take the redesign to production without seeking FCC recertification? I welcome insights on this, or anything else you found notable about this teardown, in the comments!

Brian Dipert is the Editor-in-Chief of the Edge AI and Vision Alliance, and a Senior Analyst at BDTI and Editor-in-Chief of InsideDSP, the company’s online newsletter.

Related Content

The post A Bluetooth receiver, an identity deceiver appeared first on EDN.

Software-defined vehicle (SDV): A technology to watch in 2025

EDN Network - Mon, 12/30/2024 - 16:58

Software-defined vehicle (SDV) technology has been a prominent highlight in the quickly evolving automotive industry. But how much of it is hype, and where is the real and tangible value? CES 2025 in Las Vegas will be an important venue to gauge the actual progress this technology has made in bringing code to the road.

Elektrobit will demonstrate its cloud-based virtual development, prototyping, testing, and validation platform for digital cockpits and in-vehicle infotainment (IVI) at the show. The company’s SDV solutions encompass AMD’s automotive-grade hardware, Google’s Android Automotive and Gemini AI, Epic Games’ Unreal Engine for 3D rendering, and Here navigation.

Figure 1 SDV promises a future-proof cockpit that is agnostic of the underlying hardware and software. Source: Elektrobit

Moreover, at CES 2025, Sony Honda Mobility will showcase its AFEELA prototype for electric vehicles (EVs), which employs Elektrobit’s digital cockpit built around a software-defined approach. Elektrobit’s other partners demonstrating their SDV solutions at the show include AWS, Cognizant, dSPACE, Siemens, and Sonatus.

SDV’s 2024 diary

Earlier, in April 2024, leading automotive chipmaker Infineon joined hands with embedded software specialist Green Hills to jointly develop SDV architectures for EV drivetrains. Infineon would combine its microcontroller-based processing platform AURIX TC4x with safety-certified real-time operating system (RTOS) µ-velOSity from Green Hills.

Figure 2 Real-time automotive systems are crucial in SDV architectures. Source: Infineon Technologies

Green Hills has already ported its µ-velOSity RTOS to the AURIX TC4x microcontrollers. The outcome of this collaboration will be safety-critical real-time automotive systems capable of serving SDV designs and features.

Next, Siemens EDA has partnered with Arm and AWS to accelerate the creation of virtual cars in the cloud. The toolmaker has announced the availability of its PAVE360-based solution for automotive digital twin on AWS cloud services.

Figure 3 The digital twin solution on the AWS platform aims to create a virtual car in the cloud. Source: Siemens EDA

“The automotive industry is facing disruption from multiple directions, but the greatest potential for growth and new revenue streams is the adoption of the software-defined vehicle,” said Mike Ellow, executive VP of EDA Global Sales, Services and Customer Support at Siemens Digital Industries Software. “The hyper-competitive SDV industry is under immense pressure to quickly react to consumer expectations for new features.”

That’s driving the co-development of hardware and software in parallel and the move toward a holistic digital twin, he added. Dipti Vachani, senior VP and GM of the Automotive Line of Business at Arm, went a step further, saying that the software-defined vehicle is a matter of survival for the automotive industry.

Hype or reality

The above recap of 2024 activities shows that a lot is happening in the SDV design space. A recent IDTechEx report titled “Software-Defined Vehicles, Connected Cars, and AI in Cars 2024-2034: Markets, Trends, and Forecasts” claims that the cellular connectivity within SDVs can provide access to Internet of Things (IoT) features such as over-the-air (OTA) updates, personalization, and entertainment options.

It also explains how artificial intelligence (AI) within an SDV solution can work as a digital assistant to communicate and respond to the driver and make interaction more engaging using AI-based visual characters appearing on the dashboard. BMW is already offering a selection of SDV features, including driving assistants and traffic camera information.

Figure 4 SDVs promise new revenue streams for car OEMs. Source: IDTechEx

At CES 2025, automotive OEMs, Tier 1s, chip vendors, and software suppliers are expected to present their technology roadmaps for SDV products. This will offer good visibility into how ready present SDV technology is for the cars of today and tomorrow.

Related Content


The post Software-defined vehicle (SDV): A technology to watch in 2025 appeared first on EDN.

Green Semiconductors: Balancing Performance and Sustainability

ELE Times - Mon, 12/30/2024 - 14:25

In today’s rapidly evolving technology landscape, semiconductors play a central role in powering a wide range of devices, from smartphones and computers to cars and industrial systems. As the demand for more advanced, faster, and efficient electronic devices grows, there is an increasing push to develop semiconductors that not only offer high performance but are also environmentally sustainable. This shift toward “green semiconductors” is driven by the growing need to balance technological progress with the imperative to address climate change and reduce environmental impact.

What Are Green Semiconductors?

Green semiconductors are materials and technologies that prioritize energy efficiency, sustainability, and reduced environmental impact throughout their life cycle—from manufacturing to disposal. These semiconductors are designed with the intent to minimize the carbon footprint, energy consumption, and material waste, all while maintaining or improving their performance. They represent an intersection between cutting-edge technology and environmental responsibility, marking a significant step forward in the quest for sustainable innovation.

The Rising Demand for Sustainable Electronics

The global electronics industry is undergoing a transformation driven by the need for more sustainable solutions. According to a report from the International Energy Agency (IEA), the energy consumption of the global electronics sector is expected to increase by 4-6% annually in the coming decades. This growing demand for electronics places a significant burden on power grids and intensifies the need for efficient energy use. Additionally, the production and disposal of electronic devices are major contributors to environmental pollution, from the mining of rare earth metals to the disposal of electronic waste.

With these factors in mind, semiconductor manufacturers are being called upon to innovate in ways that will mitigate the environmental impact of their products. While the semiconductor industry is responsible for producing the components essential to nearly every modern technological advancement, it is also one of the most energy-intensive industries in the world, requiring high amounts of power and raw materials.

Key Aspects of Green Semiconductor Technologies

Several technological approaches are being pursued to create greener semiconductors. These include innovations in materials, design, manufacturing processes, and end-of-life disposal. The following are some of the key aspects of green semiconductor technologies:

  1. Energy-Efficient Materials
    Traditional semiconductors, such as silicon, have been the cornerstone of the industry for decades. However, the growing demand for faster processing speeds and lower energy consumption has spurred the development of alternative materials. Gallium nitride (GaN) and silicon carbide (SiC) are two examples of materials gaining traction in power electronics and high-performance computing. These materials offer improved efficiency and performance compared to traditional silicon chips. They can handle higher voltages, frequencies, and temperatures, leading to more efficient energy conversion and less heat generation. For instance, GaN semiconductors are used in electric vehicle charging stations, where high efficiency and fast charging are crucial. (A back-of-the-envelope loss comparison appears after this list.)
  2. Low-Power Semiconductors
    A key component of green semiconductors is their ability to operate at lower power. The transition from larger, power-hungry devices to low-power alternatives has been an important focus for the industry. For example, processors designed for mobile devices or edge AI systems are built with an emphasis on reducing power consumption while maintaining high processing capabilities. Low-power semiconductors are essential in consumer electronics such as smartphones, wearables, and home automation systems, where prolonged battery life is a critical performance factor. Companies like ARM are developing more energy-efficient chip architectures, making them ideal for green semiconductor solutions.
  3. Recyclability and Sustainable Manufacturing
    The manufacturing process for semiconductors can be resource-intensive and harmful to the environment. Traditional semiconductor manufacturing involves toxic chemicals, energy-intensive fabrication processes, and non-recyclable materials. As a result, companies are exploring sustainable practices to reduce waste and energy consumption. One such method is the use of recyclable materials for chip components, such as recyclable plastics for packaging and the use of more environmentally friendly chemicals in the fabrication process. Additionally, advancements in additive manufacturing (3D printing) are allowing for more precise and efficient production, which reduces material waste and energy consumption.
  4. Advanced Packaging Techniques
    Semiconductor packaging refers to the physical casing that holds a semiconductor chip and connects it to the external circuits. Traditional packaging materials and processes can contribute significantly to waste and environmental harm. New, more sustainable packaging solutions are being developed to reduce these impacts. For example, techniques like system-in-package (SiP) and chip-on-board (COB) enable more compact and efficient designs, which reduce the need for multiple components and lower overall energy consumption. These innovations also make it easier to recycle semiconductor devices at the end of their life.
  5. AI and Machine Learning for Optimization
    Artificial intelligence (AI) and machine learning (ML) can play a crucial role in optimizing semiconductor designs and manufacturing processes. By utilizing AI algorithms, manufacturers can predict and control energy consumption in real-time, minimize material waste, and optimize production efficiency. AI-driven techniques can also be used to create smarter semiconductors capable of learning from their environment and adjusting their operation to maximize energy efficiency without sacrificing performance.
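
To put the energy-efficient materials point (item 1 above) in rough quantitative terms, here is a minimal Python sketch comparing steady-state conduction loss for a silicon and a SiC switch carrying the same current. The current and on-resistance values are purely illustrative assumptions, not figures from any specific datasheet or from this article; the point is only the familiar P = I² × R relationship.

# Illustrative only: conduction-loss comparison for a silicon vs. a SiC
# MOSFET carrying 20 A. The on-resistance values are hypothetical and
# chosen merely to show the shape of the calculation (P = I^2 * Rds_on).

def conduction_loss_w(current_a: float, rds_on_ohm: float) -> float:
    """Steady-state conduction loss of a switch carrying a DC current."""
    return current_a ** 2 * rds_on_ohm

CURRENT_A = 20.0      # assumed load current
RDS_ON_SI = 0.080     # 80 mOhm, hypothetical silicon device
RDS_ON_SIC = 0.025    # 25 mOhm, hypothetical SiC device

loss_si = conduction_loss_w(CURRENT_A, RDS_ON_SI)
loss_sic = conduction_loss_w(CURRENT_A, RDS_ON_SIC)

print(f"Si  conduction loss: {loss_si:.0f} W")   # 32 W with these numbers
print(f"SiC conduction loss: {loss_sic:.0f} W")  # 10 W with these numbers
print(f"Heat avoided per switch: {loss_si - loss_sic:.0f} W")

Even with made-up numbers, the quadratic dependence on current makes clear why a lower on-resistance device dissipates proportionally less heat at a given load.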

The Role of Green Semiconductors in Key Industries

Green semiconductors are essential across a variety of sectors, contributing to the development of more sustainable products and processes.

  1. Automotive Industry
    The rise of electric vehicles (EVs) has significantly increased the demand for efficient power electronics, where green semiconductors are playing a key role. For instance, power semiconductors made from silicon carbide are crucial in EV charging systems, where they help reduce energy loss and enhance the overall efficiency of electric power conversion. These semiconductors are also used in motor control, onboard energy management, and regenerative braking systems in EVs, helping to maximize the vehicle’s overall energy efficiency.
  2. Renewable Energy
    Semiconductors are central to the functioning of renewable energy systems such as solar panels and wind turbines. Green semiconductors contribute by enabling better power conversion and distribution in solar inverters and wind turbine generators. Power semiconductors that use wide-bandgap materials like GaN and SiC can help maximize energy harvest while minimizing energy loss. This makes renewable energy systems more efficient and cost-effective, promoting a transition to cleaner energy sources.
  3. Healthcare
    Healthcare products, particularly wearables and medical devices, require semiconductors that are both energy-efficient and precise. In healthcare, green semiconductors are used to power sensors, diagnostic equipment, and monitoring systems, where low power consumption and longevity are critical. Innovations like flexible and biocompatible semiconductor devices are enabling breakthroughs in medical monitoring and diagnostics, offering more sustainable healthcare solutions.
  4. Data Centers and Cloud Computing
    Data centers are known for their high energy consumption. As the demand for cloud services grows, energy efficiency has become a major priority for data center operators. Green semiconductors can help reduce the energy consumption of servers, storage devices, and networking components. Low-power processors, optimized circuit designs, and efficient memory systems are essential in making cloud computing infrastructure more sustainable, reducing its environmental impact.

Overcoming the Challenges

While green semiconductors offer tremendous promise, their development is not without challenges. For one, the research and development of alternative semiconductor materials like GaN and SiC require significant investment, as these materials are often more expensive and less mature than traditional silicon. Moreover, the manufacturing processes for these advanced materials can be complex and costly. Additionally, there is a need for standardization in the production of green semiconductors to ensure they meet the necessary performance and environmental standards.

Conclusion

The emergence of green semiconductors is a crucial step toward balancing technological innovation with environmental sustainability. By focusing on energy-efficient materials, low-power devices, and sustainable manufacturing processes, the semiconductor industry is laying the groundwork for a more sustainable and responsible future. As demand for semiconductors continues to rise in sectors like automotive, healthcare, and renewable energy, green semiconductors will play a key role in powering the future while minimizing the environmental impact. Achieving this balance between performance and sustainability will require continued innovation and collaboration across the industry, but the rewards—both for the environment and for society—will be well worth the effort.

The post Green Semiconductors: Balancing Performance and Sustainability appeared first on ELE Times.

Soldering Meaning, Types, Process, Working, Uses and Machine

ELE Times - Mon, 12/30/2024 - 14:06

Soldering is a process used to join two or more metal components by melting and flowing a filler metal, known as solder, into the joint. The filler metal has a lower melting point than the workpieces, ensuring that the base materials do not melt during the process. Soldering is widely used in electronics, plumbing, jewellery making, and metalwork due to its ability to create reliable and conductive joints.

Soldering History

The history of soldering dates back thousands of years to ancient civilizations. The earliest evidence of soldering was found in Mesopotamia around 3000 BCE, where goldsmiths used soldering to join gold pieces. Ancient Egyptians and Romans also used soldering techniques for jewellery and weaponry. By the Middle Ages, soldering became essential in stained glass art and decorative metalwork. The industrial revolution in the 18th and 19th centuries saw significant advancements in soldering tools and materials, making it integral to electrical and mechanical applications. Today, soldering remains a cornerstone in modern manufacturing and repair processes.

Types of Soldering

Soldering techniques are categorized based on the temperature and materials involved:

  1. Soft Soldering:
    • Operates at temperatures below 400°C (752°F).
    • Commonly used in electronics and plumbing.
    • Utilizes tin-lead or lead-free alloys.
  2. Hard Soldering:
    • Involves higher temperatures and stronger joints.
    • Includes techniques like silver soldering.
    • Used in jewellery, metalwork, and mechanical assemblies.
  3. Brazing:
    • Often considered a high-temperature form of soldering.
    • Filler metals like brass or silver are melted above 450°C (842°F).
    • Suitable for heavy-duty applications, including aerospace and automotive industries.
  4. Wave Soldering:
    • Used in mass production of printed circuit boards (PCBs).
    • Components are soldered simultaneously by passing them over a wave of molten solder.
  5. Reflow Soldering:
    • Involves applying solder paste and heating it to attach electronic components.
    • Widely used in surface-mount technology (SMT).
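
To make the reflow step more concrete, the sketch below sanity-checks a reflow oven temperature profile against limits of the kind a solder-paste datasheet specifies. The profile points and the limits (liquidus around 217°C, peak window, time above liquidus) are assumptions typical of a lead-free SAC-type alloy, not values taken from this article.

# Minimal sketch: check a reflow temperature profile against assumed
# lead-free limits (liquidus ~217 C, peak 235-250 C, 45-90 s above
# liquidus). All numbers here are illustrative assumptions.

profile = [  # (time in seconds, temperature in C), hypothetical oven log
    (0, 25), (60, 120), (120, 160), (180, 200),
    (210, 230), (240, 245), (270, 220), (300, 180),
]

LIQUIDUS_C = 217
PEAK_MIN_C, PEAK_MAX_C = 235, 250
TAL_MIN_S, TAL_MAX_S = 45, 90  # time above liquidus, seconds

peak = max(temp for _, temp in profile)
# Coarse estimate of time above liquidus: sum the sample intervals whose
# starting temperature exceeds the liquidus (samples are 30 s apart here).
tal = sum(t2 - t1 for (t1, c1), (t2, _) in zip(profile, profile[1:])
          if c1 > LIQUIDUS_C)

print(f"Peak temperature: {peak} C ->",
      "OK" if PEAK_MIN_C <= peak <= PEAK_MAX_C else "out of range")
print(f"Time above liquidus: ~{tal} s ->",
      "OK" if TAL_MIN_S <= tal <= TAL_MAX_S else "out of range")

In production this check is done with dedicated profiling tools against the paste vendor’s datasheet; the sketch only shows the shape of the logic.
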
How Does Soldering Work?

Soldering works by creating a metallurgical bond between the solder and the base materials. The process involves:

  1. Preparation: The surfaces to be joined are cleaned to remove oxidation, dirt, and grease.
  2. Flux Application: Flux is applied to prevent oxidation during heating and to improve solder flow.
  3. Heating: A soldering iron or other heat source heats the joint, melting the solder.
  4. Bond Formation: The molten solder flows into the joint via capillary action and solidifies, forming a strong, conductive bond.
The Soldering Process
  1. Clean the Components: Ensure the surfaces are free of contaminants for a strong bond.
  2. Apply Flux: Spread flux on the joint area to enhance adhesion and prevent oxidation.
  3. Heat the Joint: Use a soldering iron to heat the connection point, not the solder directly.
  4. Apply Solder: Feed the solder wire into the heated joint, allowing it to flow naturally.
  5. Inspect the Joint: Check for a shiny and smooth appearance, indicating a successful bond.
  6. Clean the Joint: Remove any residual flux or debris for a neat finish.
Soldering Uses & Applications

Soldering is employed across various industries for diverse applications:

  1. Electronics:
    • Assembling circuit boards.
    • Repairing electronic devices like smartphones, TVs, and laptops.
  2. Plumbing:
    • Joining copper pipes for water supply and HVAC systems.
  3. Jewellery Making:
    • Creating intricate designs and securing precious stones.
  4. Automotive:
    • Connecting wiring harnesses and electronic components in vehicles.
  5. Art and Craft:
    • Stained glass creation and decorative metal projects.
  6. Aerospace and Defense:
    • Ensuring reliable connections in high-performance environments.
Soldering Advantages
  1. Strong Joints: Produces durable connections capable of withstanding mechanical stress.
  2. Electrical Conductivity: Ensures reliable electrical connections in circuits.
  3. Versatility: Suitable for a wide range of materials and industries.
  4. Cost-Effective: Requires relatively inexpensive tools and materials.
  5. Repairability: Allows for easy rework and repairs of damaged joints.
  6. Precision: Enables intricate and delicate work, especially in electronics and jewelry.
Soldering Machine

Soldering machines are automated or semi-automated tools designed to enhance the soldering process. They are used for efficiency and precision in industrial applications. Common types include:

  1. Soldering Irons: Handheld tools with a heated tip for manual soldering.
  2. Soldering Stations: Advanced setups with adjustable temperature controls and interchangeable tips.
  3. Wave Soldering Machines: Automate the soldering of components on PCBs for high-volume production.
  4. Reflow Soldering Ovens: Heat solder paste to attach surface-mounted components in electronic assemblies.
  5. Robotic Soldering Machines: Use programmed movements for consistent and precise soldering in manufacturing.
Conclusion

Soldering is a fundamental technique that has evolved significantly over centuries, finding applications across industries due to its reliability and efficiency. From ancient goldsmiths to modern electronics, soldering continues to enable the creation and repair of essential components in our daily lives. With advancements in soldering tools and machines, it remains a vital process in manufacturing, art, and engineering, driving innovation and connectivity worldwide.

The post Soldering Meaning, Types, Process, Working, Uses and Machine appeared first on ELE Times.

IoT Smart Lighting System, Types, Technology, Products and Benefits

ELE Times - Mon, 12/30/2024 - 13:05

IoT (Internet of Things) smart lighting refers to a technology-driven lighting system that integrates traditional lighting with IoT capabilities, allowing for advanced features such as remote control, automation, energy efficiency, and personalized user experiences. These systems are connected to the internet and can be managed via smartphones, voice assistants, or central hubs. They often incorporate sensors and advanced algorithms to adjust lighting conditions based on environmental and user preferences.

What is an IoT Lighting System?

An IoT lighting system is a network of interconnected smart lighting devices designed to operate collaboratively through internet connectivity. These systems include components such as smart bulbs, luminaires, motion sensors, and control units, all communicating with each other through protocols like Wi-Fi, Zigbee, or Bluetooth. IoT lighting systems can be part of larger smart home or smart building solutions, enabling seamless integration with other IoT devices like thermostats, security cameras, or HVAC systems.

Types of IoT Smart Lighting

IoT smart lighting solutions come in various types, tailored to different applications and needs:

  1. Smart Bulbs: Individual bulbs that can change color, intensity, and schedules via apps or voice assistants.
    • Examples: Philips Hue, Wyze Bulb.
  2. Smart Light Strips: Flexible lighting strips for decorative purposes, often used in architectural or ambient lighting.
    • Examples: LIFX Z, Govee LED Strips.
  3. Smart Outdoor Lighting: Weather-resistant lighting solutions for gardens, pathways, or security purposes.
    • Examples: Ring Smart Lighting, Philips Hue Outdoor.
  4. Connected Ceiling Fixtures: Entire luminaires with built-in IoT features for homes or offices.
    • Examples: GE Cync Smart Ceiling Fixtures.
  5. Industrial and Commercial IoT Lighting: Large-scale lighting solutions for warehouses, factories, and office buildings, incorporating energy optimization and centralized control.
    • Examples: Current by GE, SmartCast by Cree Lighting.
IoT Smart Lighting Technology

IoT smart lighting relies on several key technologies to function effectively:

  1. Wireless Communication Protocols:
    • Wi-Fi: Offers direct connectivity but may consume more power.
    • Zigbee: Low-power, mesh networking for reliable communication.
    • Bluetooth Low Energy (BLE): Energy-efficient and suitable for localized controls.
  2. Sensors:
    • Motion Sensors: Detect movement to activate or dim lights.
    • Ambient Light Sensors: Adjust brightness based on surrounding light levels.
    • Presence Sensors: Differentiate between occupied and unoccupied spaces.
  3. Cloud Computing: Enables remote access, data storage, and processing for features like predictive maintenance and advanced analytics.
  4. Edge Computing: Processes data locally for real-time adjustments, reducing latency and dependence on cloud services. (A minimal control-loop sketch appears after this list.)
  5. Integration with AI and Machine Learning: Personalizes lighting based on learned user habits and preferences.
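
As a small illustration of the sensor and edge-computing points above, the sketch below keeps the lighting decision local to the device: motion and ambient-light readings are combined without a cloud round trip. The lux threshold, the five-minute hold time, and the sensor data format are all hypothetical.

# Minimal edge-side control sketch: decide a light's state locally from
# motion and ambient-light readings. The threshold and hold time below
# are illustrative assumptions.

import time
from dataclasses import dataclass

AMBIENT_LUX_THRESHOLD = 150.0  # below this, the room is considered dark
HOLD_SECONDS = 300             # keep the light on this long after the last motion

@dataclass
class Reading:
    motion: bool        # from a PIR or presence sensor
    ambient_lux: float  # from an ambient-light sensor

def decide(reading: Reading, last_motion_ts: float, now: float) -> tuple[bool, float]:
    """Return (light_on, updated last-motion timestamp)."""
    if reading.motion:
        last_motion_ts = now
    occupied_recently = (now - last_motion_ts) <= HOLD_SECONDS
    dark_enough = reading.ambient_lux < AMBIENT_LUX_THRESHOLD
    return occupied_recently and dark_enough, last_motion_ts

# Example: motion in a dim room turns the light on; daylight keeps it off.
now = time.time()
on, ts = decide(Reading(motion=True, ambient_lux=40.0), last_motion_ts=0.0, now=now)
print("Dim room, motion detected ->", "ON" if on else "OFF")
on, _ = decide(Reading(motion=True, ambient_lux=800.0), last_motion_ts=ts, now=now)
print("Bright daylight, motion detected ->", "ON" if on else "OFF")

A real product would publish the decision to the bulb over Zigbee, BLE, or Wi-Fi; running the logic at the edge is what keeps the response immediate even when the internet connection is down.
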
Popular IoT Smart Lighting Products
  1. Philips Hue: A comprehensive smart lighting ecosystem including bulbs, light strips, and outdoor lights.
  2. LIFX Smart Bulbs: Known for vibrant colors and Wi-Fi connectivity without the need for a hub.
  3. Wyze Bulb: Affordable smart bulbs offering voice and app controls.
  4. Ring Smart Lighting: Focused on outdoor and security lighting solutions.
  5. SmartCast by Cree Lighting: Advanced solutions for commercial and industrial applications.
Benefits of IoT Smart Lighting

IoT smart lighting provides numerous benefits for households, businesses, and cities, making it a transformative technology for modern living and operations:

  1. Energy Efficiency:
    • Automatically adjusts lighting based on natural light availability or room occupancy, significantly reducing energy consumption.
    • LED technology combined with smart features leads to lower electricity bills. (A rough savings estimate appears after this list.)
  2. Convenience and Automation:
    • Allows remote control via apps or voice commands, eliminating the need to physically interact with switches.
    • Supports customizable schedules and routines to match daily habits.
  3. Enhanced Security:
    • Motion-activated outdoor lights deter intruders.
    • Lighting schedules can mimic human activity when occupants are away, enhancing home security.
  4. Improved Mood and Productivity:
    • Dynamic lighting options like warm tones for relaxation and bright white light for focus contribute to well-being.
    • Suitable for circadian rhythm lighting, which aligns with natural daylight patterns to promote better sleep and energy levels.
  5. Scalability and Flexibility:
    • Easy to add or replace components without significant infrastructure changes.
    • Adaptable for diverse environments, from small homes to large commercial buildings.
  6. Cost Savings in Maintenance:
    • Predictive analytics notify users of potential failures, enabling timely replacements and reducing downtime.
  7. Sustainability:
    • Promotes eco-friendly practices through reduced energy use and longer lifespans of LED products.
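
As a rough illustration of the energy-efficiency benefit listed first above, the sketch below estimates annual savings from occupancy- and daylight-based control. The fixture count, wattage, operating hours, duty factor, and tariff are illustrative assumptions rather than measured figures.

# Back-of-the-envelope estimate of annual savings from occupancy- and
# daylight-based dimming. Every input below is an illustrative assumption.

FIXTURES = 20              # number of LED fixtures
WATTS_PER_FIXTURE = 12.0   # rated power per fixture
HOURS_ON_PER_DAY = 10.0    # hours the lights would run unmanaged
SMART_DUTY_FACTOR = 0.6    # fraction of that energy still used once
                           # occupancy sensing and daylight dimming are active
TARIFF_PER_KWH = 0.15      # assumed electricity price, local currency per kWh

baseline_kwh = FIXTURES * WATTS_PER_FIXTURE * HOURS_ON_PER_DAY * 365 / 1000
smart_kwh = baseline_kwh * SMART_DUTY_FACTOR

print(f"Baseline energy:    {baseline_kwh:.0f} kWh/year")
print(f"With smart control: {smart_kwh:.0f} kWh/year")
print(f"Estimated savings:  {baseline_kwh - smart_kwh:.0f} kWh/year "
      f"(about {(baseline_kwh - smart_kwh) * TARIFF_PER_KWH:.0f} in energy cost)")

The actual duty factor depends heavily on occupancy patterns and daylight availability, which is why predictive analytics and per-room sensing matter in practice.
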
Conclusion

IoT smart lighting represents a significant leap forward in lighting technology, combining energy efficiency, automation, and personalization to enhance living and working environments. With continuous advancements in IoT and AI, these systems are becoming increasingly sophisticated, accessible, and essential in achieving sustainability and convenience. Whether for homes, businesses, or cities, IoT smart lighting is paving the way for a brighter, smarter future.


The post IoT Smart Lighting System, Types, Technology, Products and Benefits appeared first on ELE Times.
