News from the world of micro- and nanoelectronics

IQE to add 109 jobs and invest $305m expanding US operation in Greensboro

Semiconductor today - Tue, 11/05/2024 - 17:12
Epiwafer and substrate maker IQE plc of Cardiff, Wales, UK has announced an expansion of its manufacturing facility in Greensboro, Guilford County, North Carolina, subject to customer commitments and funding from the US federal CHIPS and Science Act. The firm plans to add 109 jobs and invest $305m over several years...

My first appliance repair

Reddit:Electronics - Tue, 11/05/2024 - 17:02

Long time lurker. I consider myself reasonably handy, but this was the first time working on an appliance. Grabbed this microwave for $50 on Facebook Marketplace 6 months ago. Friday it did the whirlpool hum of death. Unsure if it was the diode, capacitor, or magnetron, I replaced them all. Got all components off Amazon, and replacement took 1.5 hours from taking it down to putting it back up. Now I’m on Facebook Marketplace looking for “broken” appliances I can fix and flip haha. Thanks to this sub for giving me the confidence to do this!

submitted by /u/Boston__Massacre

Smart TV power-ON aid

EDN Network - Tue, 11/05/2024 - 16:59

Editor’s note: This design idea offers a solution to the common issue of a TV automatically restarting after a power outage. Often, power is restored when the consumer is not present, and the TV is unknowingly left running. This could be due to several reasons, including the TV’s HDMI-CEC settings or simply an automatic power-restore factory setting. While it is a useful setting to have enabled, it would be helpful to ensure the TV will not automatically turn on when power is restored after a long outage.

Introduction

Present-day TV designers take ample care in power-supply design so that the TV comes “ON” automatically after a power shutdown and resumption if it was “ON” before the shutdown. If the TV was “OFF” before the shutdown, it stays “OFF” even after power is restored. This is an excellent feature; one can continue watching TV after a brief outage without any manual intervention.

At times, this can lead to inconveniences during long power outages. The last time this happened to us, we were watching TV when the power suddenly went off. At some point during the outage, we had to leave and did not return home for two days. The power probably resumed a few hours after we left; as designed, the TV turned “ON” automatically and stayed “ON” for two days, disturbing the neighbors until we returned and switched it off. What a disturbance to others!

Wow the engineering world with your unique design: Design Ideas Submission Guide

TV power-ON aid

I designed the “TV power-ON aid” gadget in Figure 1 to overcome this problem. Mains power is fed to the gadget, and power is fed to the TV through it. Once the SW1 switch/button is pushed, the TV receives power as long as mains power is present. If power goes “OFF” and resumes within, say, half an hour, the TV receives mains power without any manual intervention, just like the original design. If power resumes after half an hour, when you are likely no longer near the TV, power is not extended to the TV automatically; instead, you have to push SW1 once to feed power to the TV. This gadget keeps an unattended TV from blasting shows and disturbing the neighbors, a problem anyone can face during a long power outage while away from the house.

Figure 1 TV power-ON aid circuit. Connect mains power to J1. Take power to the TV from J2. Connect the power-supply pins of U2, U3, and U4 to V1. The grounds of these ICs must be connected to the negative terminal of the battery; these connections are not shown in the image.

Circuit description

The first time, you have to press the momentary push button SW1 once. Relay RL2 energizes and its “NO” contact closes, shorting SW1. Hence, the relay stays “ON” and power is extended to the TV.

When mains power goes off, relay RL2 de-energizes. Through the “NC” contact of RL2, the battery (3× 1.5-V alkaline cells) becomes connected to the OFF-delay timer circuit formed by U1 (555), U2 (4011), U3 (4020), and U4 (4017). As soon as the battery is connected, this circuit switches “ON” relay RL1 through MOSFET Q1 (IRLZ44N); its “NO” contact closes and shorts SW1.

The timer circuit holds this relay for approximately half an hour (the time can be adjusted by suitable selection of C2). If power resumes within this half-hour period, the power gets fed to the TV automatically, since SW1 is shorted by the RL1 contact. If power resumes after half an hour, RL1 has already been de-energized by the OFF-delay timer action, its contact across SW1 is open, and power is not extended to the TV. This is a safe condition. When you come back, you can push SW1 to feed power to the TV. The RL1 coil voltage is 5 V, and the RL2 coil voltage is either 230 V AC or 110 V AC as needed.

The U1 circuit works as an oscillator, and the U3 circuit works as a frequency divider; the divided frequency is counted by the U4 circuit. When the time delay reaches around 30 minutes, its Q9 output goes high, the U2C output goes “LO”, and RL1 de-energizes. Whenever power goes off, the timer circuit gets battery voltage through the “NC” contact of RL2; when power resumes, the battery is disconnected from the timer circuit, saving battery power. A lithium-ion battery and charger circuit can be used in place of the alkaline batteries, if desired.
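As a rough sanity check on that half-hour delay, here is a minimal Python sketch of how the 555 oscillator period, the 4020 division ratio, and the 4017 count multiply out to roughly 30 minutes. The timing-resistor values, the C2 value, and the choice of the 4020’s Q14 tap are assumptions for illustration only, not values taken from the published schematic.

# Rough estimate of the OFF-delay chain (component values are assumptions).
R_A = 10e3       # ohms, assumed 555 astable timing resistor
R_B = 47e3       # ohms, assumed
C2  = 0.18e-6    # farads, assumed timing capacitor (the article's C2)

t_555 = 0.693 * (R_A + 2 * R_B) * C2   # standard 555 astable period, ~13 ms

DIV_4020   = 2 ** 14   # assume the CD4020's Q14 output (divide by 16384) clocks U4
COUNT_4017 = 9         # the CD4017's Q9 output goes high on the 9th clock pulse

delay = t_555 * DIV_4020 * COUNT_4017
print(f"555 period ~ {t_555 * 1e3:.1f} ms, total delay ~ {delay / 60:.1f} min")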

Jayapal Ramalingam has over three decades of experience in designing electronics systems for power & process industries and is presently a freelance automation consultant.

Related Content


The post Smart TV power-ON aid appeared first on EDN.

How to Control NeoPixel Installations via Wi-Fi Using Fishino and NodeMCU with Python

Open Electronics - Tue, 11/05/2024 - 15:51

We create NeoPixel light installations with Fishino and NodeMCU controlled via Wi-Fi from a PC or Raspberry Pi using a Python library. A few years ago, the American company Adafruit Industries transformed the LED market by introducing NeoPixels, RGB LEDs that incorporate their own controller in a single package. Adafruit aimed to simplify LED management […]
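The excerpt doesn’t show the article’s Python library, but as a rough illustration of the idea, here is a minimal sketch of streaming an RGB frame from a PC (or Raspberry Pi) to a Wi-Fi-attached NeoPixel controller. The UDP transport, address, port, and raw-RGB payload format are all assumptions for illustration, not the protocol used by the Fishino/NodeMCU firmware.

import socket

CONTROLLER = ("192.168.1.50", 7777)   # assumed IP/port of the Wi-Fi LED controller
NUM_PIXELS = 30

def send_frame(pixels):
    # pixels: list of (r, g, b) tuples, one per NeoPixel
    payload = bytes(channel for rgb in pixels for channel in rgb)
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.sendto(payload, CONTROLLER)

# Example: set the whole strip to a dim warm white.
send_frame([(32, 24, 8)] * NUM_PIXELS)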

The post How to Control NeoPixel Installations via Wi-Fi Using Fishino and NodeMCU with Python appeared first on Open Electronics. The author is Boris Landoni.

Power Integrations launches 1700V GaN switcher IC

Semiconductor today - Tue, 11/05/2024 - 14:36
Power Integrations Inc of San Jose, CA, USA (which provides high-voltage integrated circuits for energy-efficient power conversion) has added to its InnoMux-2 family of single-stage, independently regulated multi-output offline power supply ICs by introducing a new device featuring what is claimed to be the first 1700V gallium nitride (GaN) switch, fabricated using the company’s proprietary PowiGaN technology. The 1700V rating is said to further advance the state of the art for GaN power devices, previously set by Power Integrations’ own 900V and 1250V devices, both launched in 2023...

Lumentum releases 2024 Corporate Sustainability Report

Semiconductor today - Tue, 11/05/2024 - 10:50
Lumentum Holdings Inc of San Jose, CA, USA (which designs and makes optical and photonic products for optical networks and lasers for industrial and consumer markets) has issued its fourth annual Corporate Sustainability Report, for fiscal 2024 (ended 29 June), showcasing its environmental, social and governance achievements...

My first RLC circuit as an end-of-semester project (circuits 1)

Reddit:Electronics - Tue, 11/05/2024 - 05:54

Just wanted to share my first DC circuit; it is for my first circuits class, as an end-of-semester project.

It consists of just a relay, a 1000 µF capacitor, and a 330 Ω resistor.

It still is not the proper or final circuit, as I'm using a 5 V relay with a 12 V power source. Also, it is connected in the "inverse way", as the LEDs need to imitate the signal lights of a car.

Tomorrow I will buy the correct parts, but for the moment I'm happy to see that this works and that I can do the math to understand it!
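For readers curious about that math, a quick back-of-envelope sketch of the RC time constant, assuming the 1000 µF capacitor charges and discharges through the 330 Ω resistor (an assumption about this particular wiring):

# RC time constant of the flasher (wiring assumption: C charges through R).
R = 330        # ohms
C = 1000e-6    # farads (1000 uF)
tau = R * C    # ~0.33 s, roughly the scale of a turn-signal blink
print(f"tau = {tau:.2f} s")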

submitted by /u/InsectOk8268

Infineon launches CoolGaN Transistors 650V G5 product family

Semiconductor today - Mon, 11/04/2024 - 21:10
Infineon Technologies AG of Munich, Germany has strengthened its gallium nitride (GaN) portfolio by launching a new family of high-voltage discretes, the CoolGaN Transistors 650V G5. Target applications range from consumer and industrial switched-mode power supplies (SMPS) such as USB-C adapters and chargers, lighting, TV, data-center and telecom rectifiers to renewable energy and motor drives in home appliances...

MACOM to lead US CHIPS Act-funded GaN-on-SiC technology development project

Semiconductor today - Mon, 11/04/2024 - 20:07
MACOM Technology Solutions Inc of Lowell, MA, USA (which designs and makes RF, microwave, analog and mixed-signal and optical semiconductor technologies) has been selected to lead a development project to establish gallium nitride (GaN) on silicon carbide (SiC) process technologies for radio frequency (RF) and microwave applications. Funded by the CHIPS and Science Act through the US Department of Defense (DoD), the project will focus on developing semiconductor manufacturing processes for GaN-based materials and monolithic microwave integrated circuits (MMICs) operating efficiently at high voltage and at millimeter-wave (mmW) frequencies...

Apple’s fall 2024 announcements: SoC and memory upgrade abundance

EDN Network - Mon, 11/04/2024 - 17:33

Two years ago, Apple skipped the second of its typical second-half-of-year events, focusing solely on a September 2022 unveil of new iPhones, smartwatches, and earbuds. Last year I thought the company might pull the same disappearing act, but it ended up rolling out the three-member M3 SoC family, along with an M3-based subset of its systems suite. And this year? Mid-September predictably brought us new iPhones, smartwatches, and earbuds. But as October’s end drew near with nothing but silence from Cupertino (tempting leaks from Russian vloggers aside), I wondered if this year would be a repeat of the prior or a reversion to 2022 form.

Turns out, October 2024 ended up being a conceptual repeat of 2023 after all…well, sort of. Apple waited until last Thursday (as I write these words on Halloween) to cryptically “tweet” (or is that “X”?) an “exciting week of announcements ahead, starting on Monday morning”. And indeed, the company once again unveiled new SoCs and systems (plus software updates) this year. But the news virtually (of course) dribbled out over three days this time, versus dropping at one big (online) event. That said, there were in-depth (albeit, as usual, with a dollop of hype) videos accompanying many of the releases. Without further ado, and in chronological order:

The iPad mini 7

The first unveil in the October 2024 new-product sequence actually happened two weeks ago, when Apple rolled out its latest-generation tiny tablet. That the iPad mini 7 would sooner-or-later appear wasn’t a surprise, although I suppose Apple could have flat-out killed the entire iPad mini line instead, as it’s already done with the iPhone mini. The iPad mini 6 (an example of which I own) is more than four years old at this point, as I’d mentioned back in May. And specifically, it’s based on the archaic A15 Bionic SoC and only embeds 4 GBytes of RAM, both of which are showstoppers to the company’s widespread Apple Intelligence support aspirations.

SoC update (to the A17 Pro) and memory update (to 8 GBytes, reflective of deep learning model storage requirements) aside, the iPad mini 7 pretty much mirrors its predecessor, although it does now also support the Apple Pencil, and the USB-C bandwidth has been boosted to 10 Gbps. Claimed improvements to the “jelly scrolling” display behavior seen by some iPad mini 6 users (truth be told, I never noticed it, although I mostly hold mine in landscape orientation) are muddled by iFixit’s teardown, which suggests the display controller location is unchanged.

And by the way, don’t be overly impressed with the “Pro” qualifier in the SoC’s moniker. That’s the only version of the A17 that’s been announced to date, after all. And even though it’s named the same as the SoC in the iPhone 15 Pros, it’s actually a defeatured variant, with one fewer GPU core (five, to be precise), presumably for yield-maximization reasons.

O/S updates

Speaking of Apple Intelligence, on Monday the company rolled out “.1” updates to all of its devices’ operating systems, among other things adding initial “baby step” AI enhancements. That said, European users can’t access them at the moment, presumably due to the European Union’s privacy and other concerns, which the company hopes to have resolved by next April. And for the rest of us, “.2” versions with even more enabled AI capabilities are scheduled for release this December.

A couple of specific notes on MacOS: first off, in addition to iterating its latest-generation MacOS 15 “Sequoia” O/S, Apple has followed longstanding extended-support tradition by also releasing patches for the most recent two prior-generation operating system major versions, MacOS 13 (“Ventura”) and MacOS 14 (“Sonoma”). And in my case, that’s a great thing, because it turns out I won’t be updating to “Sequoia” any time soon, at least on my internal storage capacity-constrained Mac mini. I’d been excited when I read that MacOS 15.1 betas were enabling the ability to selectively download and install App Store software either to internal storage (as before) or to an external drive (as exists in my particular setup situation).

But as it turns out, that ability is only enabled for App Store-sourced programs 1 GByte or larger in size, which is only relevant to one app I use (Skylum’s Luminar AI which, I learned in the process of writing this piece, has been superseded by Luminar Neo anyway). Plus, the MacOS “Sequoia” upgrade from “Sonoma” is ~12 GBytes, and Apple historically requires twice that available spare capacity before it allows an update attempt to proceed (right now I have not 25 GBytes, but only 7.5 GBytes, free on the internal SSD). And the Apple Intelligence capabilities aren’t relevant to Intel-based systems, anyway. So…nah, at least for now.

By the way, before proceeding with your reading of my piece, I encourage you to watch at least the second promo video above from Apple, followed by the perusal of an analysis (or, if you prefer, take-down) of it by the always hilarious (and, I might add, courageous) Paul Kafasis, co-founder and CEO of longstanding Apple developer Rogue Amoeba Software, whose excellent audio and other applications I’ve mentioned many times before.

The 24” iMac

Apple rolled out some upgraded hardware on Monday, too. The company’s M1-based 24” iMac, unveiled in April 2021, was one of its first Apple Silicon-based systems. Apple then skipped the M2 SoC generation for this particular computing platform, holding out until last November (~2.5 years later), when the M3 successor finally appeared. But it appears that the company’s now picking up the pace, since the M4 version just showed up, less than a year after that. This is also the first M4-based computer from Apple, following in the footsteps of the iPad Pro tablet-based premier M4 hardware surprisingly (at least to me) released in early May. That said, as with the defeatured A17 Pro in the iPad mini 7 mentioned earlier in this writeup, the iMac’s M4 is also “binned”, with only eight-core CPU and GPU clusters in the “base” version, versus the 9- or 10-core CPU and 10-core GPU in the iPad Pros and other systems to come that I’ll mention next.

The M4 24” iMac comes first-time standard with 16 GBytes of base RAM (to my earlier note about the iPad mini’s AI-driven memory capacity update…and as a foreshadow, this won’t be the last time in this coverage that you encounter it!), and (also first-time) offers a nano-texture glass display option. Akin to the Lightning-to-USB-C updates that Apple made to its headphones back in mid-September, the company’s computer peripherals (mouse, keyboard and trackpad) now instead recharge over USB-C, too. The front camera is Center Stage-enhanced this time. And commensurate with the SoC update, the Thunderbolt ports are now gen4-supportive.

The Mac mini and M4 Pro SoC

Tuesday brought a more radical evolution. The latest iteration of the Mac mini is now shaped like a shrunk-down Mac Studio or, if you prefer, a somewhat bigger spin on the Apple TV. The linear dimensions and overall volume are notably altered versus its 2023 precursor, from:

  • Height: 1.41 inches (3.58 cm)
  • Width: 7.75 inches (19.70 cm)
  • Depth: 7.75 inches (19.70 cm)
  • Volume: 84.7 in3 (1,389.4 cm3)

to:

  • Height: 2.0 inches (5.0 cm)
  • Width: 5.0 inches (12.7 cm)
  • Depth: 5.0 inches (12.7 cm)
  • Volume: 50 in3 (806.5 cm3)

Said another way, the “footprint” area is less than half of what it was before, at the tradeoff of nearly 50% increased height. And the weight loss is notable too, from 2.6 pounds (1.18 kg) or 2.8 pounds (1.28 kg) before to 1.5 pounds (0.67 kg) or 1.6 pounds (0.73 kg) now. I also should note that, despite these size and weight decreases, the AC/DC conversion circuitry is still 100% within the computer; Apple hasn’t pulled the “trick” of moving it to a standalone PSU outside. That said, legacy-dimension “stacked” peripherals won’t work anymore:
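For the curious, a quick arithmetic check of those ratios, using only the dimensions listed above:

# Mac mini size comparison from Apple's published dimensions (inches).
old_w, old_d, old_h = 7.75, 7.75, 1.41   # 2023 model
new_w, new_d, new_h = 5.0, 5.0, 2.0      # 2024 model

footprint_ratio = (new_w * new_d) / (old_w * old_d)                   # ~0.42
height_increase = new_h / old_h - 1.0                                 # ~+42%
volume_ratio    = (new_w * new_d * new_h) / (old_w * old_d * old_h)   # ~0.59

print(f"footprint {footprint_ratio:.0%} of old, height +{height_increase:.0%}, "
      f"volume {volume_ratio:.0%} of old")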

And the new underside location of the power button is, in a word and IMHO, “weird”.

The two “or” qualifiers in the earlier weight-comparison sentence beg for clarification, which will simultaneously be silicon-enlightening. Akin to the earlier iMac conversation, there’s been a SoC generation skip, from the M2 straight to the M4. The early-2023 version of the Mac mini came in both M2 and M2 Pro (which I own) SoC variants. Similarly, while this year’s baseline Mac mini is powered by the M4 (in this case the full 10 CPU/10 GPU core “spin”), a high-end variant containing the brand new M4 Pro SoC is also available. In this particular (Mac mini) case, the CPU and GPU core counts are, respectively, 12 and 16. Memory bandwidth is dramatically boosted, from 120 GBytes/sec with the M4 (where once again, the base memory configuration is 16 GBytes) to 273 GBytes/sec with the M4 Pro. And the M4 Pro variant is also Apple’s first (and only, at least for a day) system that supports latest-generation Thunderbolt 5. Speaking of connectors, by the way, integrated legacy USB-A is no more. Bring on the dongles.

MacBook Pros, the M4 Max SoC and a MacBook Air “one more thing”

Last (apparently, we shouldn’t take Apple literally when it promises an “exciting week of announcements ahead”) but definitely not least, we come to Wednesday and the unveil of upgraded 14” and 16” MacBook Pros. The smaller-screen version comes in variants based on M4, M4 Pro and brand-new M4 Max SoC flavors. This time, if you dive into the tech specs, you’ll notice that the M4 Pro is “binned” into two different silicon spins, one (as before in the Mac mini) with a 12-core CPU and 16-core GPU, and a higher-end variant with a 14-core CPU and 20-core GPU. Both M4 Pro versions deliver the same memory bandwidth—273 GBytes/sec—which definitely can’t be said about the high-end M4 Max. Here, at least on the 14” MacBook Pro, you’ll again find a 14-core CPU, although this time it’s mated to a 32-core GPU, and the memory bandwidth further upticks to 410 GBytes/sec.

If you think that’s impressive (or maybe just complicated), wait until you see the 16” MacBook Pro’s variability. There’s no baseline M4 option in this case, only two M4 Pro and two M4 Max variants. Both M4 Pro base “kits” come with the M4 Pro outfitted with a 14-core CPU and 20-core GPU. The third variant includes the aforementioned 14-core CPU/32-core GPU M4 Max. And as for the highest-end M4 Max 16” MacBook Pro? 16 CPU cores. 40 GPU cores. And 546 GBytes/sec of peak memory bandwidth. The mind boggles at the attempt at comprehension.

Speaking (one last time, I promise) of memory, what about that “one more thing” in this section’s subhead? Apple has bumped up (with one notable Walmart-only exception) the baseline memory of its existing MacBook Air mobile computers to 16 GBytes, too, at no price increase from the original 8 GByte MSRPs (said another way, an effective price cut on the 16 GByte configurations), bringing the entire product line to 16 GBytes minimum. I draw two fundamental conclusions:

  • Apple is “betting the farm” on memory-demanding Apple Intelligence, and
  • If I were a DRAM supplier previously worried about filling available fab capacity, I’d be loving life right about now (although, that said, I’m sure that Apple’s purchasing department is putting the screws on your profit margins at the same time).

With that, closing in on 2,000 words, I’ll sign off for now and await your thoughts in the comments!

Brian Dipert is the Editor-in-Chief of the Edge AI and Vision Alliance, and a Senior Analyst at BDTI and Editor-in-Chief of InsideDSP, the company’s online newsletter.

Related Content


The post Apple’s fall 2024 announcements: SoC and memory upgrade abundance appeared first on EDN.

Intel: Gelsinger’s foundry gamble enters crunch

EDN Network - Mon, 11/04/2024 - 14:58

Intel is at the crossroads again, and so is its charismatic chief, Pat Gelsinger, who was brought in more than three years ago to turn around this American semiconductor giant. Is trouble brewing at the Santa Clara, California-based chip industry icon? A recent Reuters story chronicling Gelsinger’s three years at the helm suggests that it is.

The Reuters story titled “Inside Intel, CEO Pat Gelsinger fumbled the revival of an American icon” comes soon after the news about a possible patch-up between the foundry operations of Intel and Samsung, two outfits competing with fab market leader TSMC at cutting-edge semiconductor manufacturing processes.

While the hookup between these two TSMC rivals isn’t without merits, industry watchers mostly see it as a non-starter. Samsung, which entered the foundry business in 2017, has been able to grab an 11.5% fab market share compared to TSMC’s 62.3%. Intel, on the other hand, is just at the starting gate when it comes to the foundry business it set up in 2021.

While Gelsinger sought to transform Intel by venturing into the foundry business, the chipmaker steadily lost ground to AMD in the lucrative data center processors business. Meanwhile, its bread-and-butter PC processors business is still reeling from the post-pandemic glut. But Intel’s troubles don’t end here. Another elephant in the room, besides Intel Foundry, is the struggling artificial intelligence (AI) chips business.

Apparently, Intel is late to the AI party, and just like data center processors, that puts it behind companies like AMD and Nvidia. Intel, which launched three AI initiatives in 2019, including a GPU, hasn’t much to show so far and its Gaudi AI accelerator manufactured at TSMC seems to be falling short of expectations.

Figure 1 Gaudi was touted as an alternative to Nvidia’s GPUs. Source: Intel

While Gelsinger declined to be interviewed for this Reuters story, Intel’s statements published in this special report seem to have come straight from Gelsinger’s corner office. “Pat is leading one of the largest, boldest and most consequential corporate turnarounds in American business history,” said the Intel statement. “3.5 years into the journey, we have made immense progress—and we’re going to finish the job.”

Is Gelsinger in trouble?

Intel Foundry seems to be all Gelsinger is betting on, but this premise has proven easier said than done. As Sandra Rivera, now CEO of Altera and then head of Intel’s data center business, said while talking about Intel’s GPU foray, “It’s a journey, and everything looks simpler from the outside.” This premise perfectly fits Intel’s fab gambit as well.

Soon after taking charge, Gelsinger vowed to build a foundry business to compete with TSMC and promised to develop five manufacturing nodes in five years. However, the 18A process node has been facing delays, and one of its early customers, Broadcom, has reportedly seen yield issues: a mere 20% of its chips passed the early tests.

Intel maintains that 18A is on track for launch in 2025. But as Goldman Sachs analyst Toshiya Hari notes, semiconductor vendors have little incentive to bet on Intel’s manufacturing when TSMC continues to serve them well.

Figure 2 The news about problems with the launch of the 18A process node doesn’t bode well for the company’s foundry ambitions. Source: Intel

When a large company becomes an acquisition target, it generally spells doom. So, in another statement in the Reuters story, Intel said that it won’t let merger speculation distract it from executing its five-year turnaround plan. That clearly shows the pressure, and how Gelsinger is asking for more time to put the house in order.

Will Gelsinger get more time? He acknowledges a lot of work ahead but is confident that Intel will pull it off. But if the foundry bet on Intel’s chip-manufacturing prowess takes longer to bear fruit, Gelsinger’s rocky tenure may end sooner rather than later.

Related Content


The post Intel: Gelsinger’s foundry gamble enters crunch appeared first on EDN.

Samsung Curved Js9000

Reddit:Electronics - Mon, 11/04/2024 - 01:20

I'm considering buying the Samsung Curved Js9000 off of FB marketplace for $180. I'm seeing this TV had great reviews when it came out in 2015, but does it hold up against today's models? I think it's a good price, but I want to make sure it won't be outdated in the next 3-5 years.

submitted by /u/Difficult_Jelly1300

Weekly discussion, complaint, and rant thread

Reddit:Electronics - Sat, 11/02/2024 - 17:00

Open to anything, including discussions, complaints, and rants.

Sub rules do not apply, so don't bother reporting incivility, off-topic, or spam.

Reddit-wide rules do apply.

To see the newest posts, sort the comments by "new" (instead of "best" or "top").

submitted by /u/AutoModerator

“Half & Half” piezo drive algorithm tames overshoot and ringing

EDN Network - Fri, 11/01/2024 - 16:40

Piezoelectric actuators (benders, stacks, chips, etc.) are excellent fast and precise means for generating and controlling micro-, nano-, and even atomic-scale movement on millisecond and faster timescales. Unfortunately, they are also excellent high-Q resonators. Figure 1 shows what you can expect if you’re in a hurry to move a piezo and simply hit it with a unit step. Result: a massive (nearly 100%) overshoot with prolonged follow-on ringing.

Wow the engineering world with your unique design: Design Ideas Submission Guide

Figure 1 Typical piezo actuator response to squarewave drive with ringing and ~100% overshoot.

Don’t worry. It’ll get there. Eventually. But don’t hold your breath. Clearly, something has to be done to modify the drive waveshape if we’re at all interested in speed and settling time. Many possibilities exist, but Figure 2 illustrates a remarkably simple yet effective trick that actually takes advantage of the piezo’s natural 2x overshoot: Half and Half step drive.

Figure 2 Half & Half drive step with half amplitude and half resonance period kills overshoot and ringing.

The surprisingly simple trick is to split the drive step into an initial step with half the desired movement amplitude and a duration of exactly half the piezo resonance period. Hence: “Half & Half” (H&H) drive. The half-step is then followed by application of the full step amplitude to hold the actuator in its new position.

The physics underlying H&H rely on the kinetic energy imparted to the mass of the actuator during the first quarter-cycle being just sufficient to overcome actuator elasticity during the second quarter, bringing the actuator to a graceful stop at the half cycle’s end. The drive voltage is then stepped to the full value, holding the actuator stationary at the final position.

Figure 3 shows how H&H would work for a sequence of arbitrary piezo moves.

Figure 3 Example of three arbitrary H&H moves: (T2 – T1) = (T4 – T3) = (T6 – T5) = ½ piezo resonance period.

If implemented in software, the H&H algorithm would be simplicity itself and look something like this:

Let DAC = current contents of DAC output register
N = new content for DAC required to produce desired piezo motion
Step 1: replace DAC = (DAC + N) / 2
Step 2: wait one piezo resonance half-period
Step 3: replace DAC = N
Done
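For a microcontroller or PC-hosted DAC, a software version might be as simple as the following sketch; the DAC-write function and the resonance period are placeholders you would supply for your own hardware (assumptions here, not part of the published design).

import time

def hh_move(write_dac, current_code, target_code, half_period_s):
    # Half & Half drive: half-amplitude step, hold for half the piezo
    # resonance period, then apply the full step and hold.
    write_dac((current_code + target_code) // 2)   # Step 1: half-amplitude step
    time.sleep(half_period_s)                      # Step 2: wait T_resonance / 2
    write_dac(target_code)                         # Step 3: full amplitude, hold
    return target_code                             # new "current" DAC code

# Usage sketch (values assumed): a 10 kHz piezo resonance gives a 50 us half-period.
# current = hh_move(my_dac_write, current, 3000, half_period_s=50e-6)

On real hardware, a timer interrupt or hardware delay would replace time.sleep(), which is too coarse at microsecond scales.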

If implemented in analog circuitry, H&H might look like Figure 4. Here’s how it works.

Figure 4 The analog implementation of H&H.

 The C1, R1, C2, R2||R3 voltage divider performs the half-amplitude division function of the H&H algorithm, while dual-polarity comparators U2 detect the leading edge of each voltage step. Step detection triggers U3a, which is adjusted via the TUNE pot to have a timeout equal to half the piezo resonance period, giving us the other “half”.

U3a timeout triggers U3b, which turns on U1, outputting the full step amplitude and completing the move. The older metal-gate CMOS 4066 is used due to its superior low-leakage Roff spec, while the parallel connection of all four of its internal switches yields an adequately low Ron.

U4 is just a placeholder for a suitable piezo drive amplifier to translate from the 5-V logic of the H&H circuitry to piezo drive voltage and power levels.

Stephen Woodward’s relationship with EDN’s DI column goes back quite a long way. Over 100 submissions have been accepted since his first contribution back in 1974.

Related Content


The post “Half & Half” piezo drive algorithm tames overshoot and ringing appeared first on EDN.

Pages

Subscribe to the Кафедра Електронної Інженерії collection - News from the world of micro- and nanoelectronics