Rad-hard SBC enables on-orbit computing
Moog’s Cascade single-board computer supports multiple payloads and spacecraft bus processing needs within a single radiation-hardened unit. Cascade was created through an R&D partnership with Microchip Technology, as part of NASA’s early-engagement ecosystem for its next-gen High-Performance Spaceflight Computing (HPSC) processor.
The SBC is based on Microchip’s PIC64-HPSC, a radiation-hardened microprocessor with 10 64-bit RISC-V cores. In addition to advanced computing power, the processor provides an Ethernet TSN Layer 2 switch for data communications, fault tolerance and correction, secure boot, and multiple levels of encryption.
Available with or without an enclosure, Cascade is an extended 3U SpaceVPX board that aligns with the Sensor Open Systems Architecture (SOSA) technical standard for maximum interoperability. The rad-hard SBC can withstand a total ionizing dose (TID) of 50 krad without shielding and tolerates single-event latchup (SEL) up to a linear energy transfer of 78 MeV·cm²/mg after bootup.
For more information about the Cascade SBC, click the product page link below.
Molex shrinks busbar current sensors
Percept current sensors from Molex employ a coreless differential Hall-effect design and proprietary packaging to slash both size and weight. The sensor-in-busbar configuration allows for simple plug-and-play installation in automotive and industrial current sensing applications, such as inverters, motor drives and EV chargers.
Percept integrates an Infineon coreless magnetic current sensor in a Molex package to create a component that is 86% lighter and up to half the size of competing current sensors. The design also suppresses stray magnetic fields and reduces sensitivity and offset errors.
Automotive and industrial-grade Percept sensors are available in current ranges from ±450 A to ±1600 A, with ±2% accuracy over temperature. They offer bidirectional sensing with options for full-differential, semi-differential, and single-ended output modes. AEC-Q100 Grade 1-qualified devices operate across a temperature range of -40°C to +125°C.
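To make the output scaling concrete, here is a short Python sketch of how a full-differential Hall-effect output maps to busbar current. The ±1600-A range and ±2% accuracy come from the paragraph above, while the output swing and sensitivity figures are placeholder assumptions, not Percept datasheet values.

```python
# Hypothetical scaling for a full-differential Hall-effect busbar sensor.
# The +/-1600 A range and +/-2% accuracy come from the text above; the 1.25 V
# differential full-scale swing is an assumed placeholder, not a datasheet figure.

FULL_SCALE_A = 1600.0
FULL_SCALE_V_DIFF = 1.25                      # assumed differential swing at full-scale current
SENSITIVITY_V_PER_A = FULL_SCALE_V_DIFF / FULL_SCALE_A

def current_from_outputs(v_outp: float, v_outn: float) -> float:
    """Convert the OUTP/OUTN pair (volts) to busbar current (amps)."""
    v_diff = v_outp - v_outn                  # differencing rejects common-mode and stray-field error
    return v_diff / SENSITIVITY_V_PER_A

if __name__ == "__main__":
    i = current_from_outputs(2.15, 1.35)      # example readings around a 1.75 V common mode
    tolerance = 0.02 * abs(i)                 # +/-2% accuracy over temperature, per the text
    print(f"I = {i:.0f} A (+/- {tolerance:.0f} A)")
```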
Sensors for industrial applications are expected to be available in October 2024, with the automotive product approval process scheduled for the first half of 2025. Limited engineering samples for industrial applications are available now.
20-A models join buck converter lineup
TDK-Lambda expands its i7A series of non-isolated step-down DC/DC converters with seven 500-W models that provide 20 A of output current. The converters occupy a standard 1/16th brick footprint and use a standardized pin configuration.
With an input voltage range of 28 V to 60 V, the new converters offer a trimmable output of 3.3 V to 32 V and achieve up to 96% efficiency. This high efficiency reduces internal losses and allows operation in ambient temperatures ranging from -40°C to +125°C. Additionally, an adjustable current limit option helps manage stress on the converter and load during overcurrent conditions, enabling precise adjustment based on system needs.
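To put that efficiency figure in thermal terms, here is a quick back-of-envelope calculation in Python at an assumed 48-V-input, 24-V-output, 20-A operating point; the operating point is illustrative, not a datasheet test condition.

```python
# Back-of-envelope dissipation for a 20 A buck converter at 96% efficiency.
# The 48 V in / 24 V out operating point is an assumed example within the
# 28-60 V input and 3.3-32 V trim ranges quoted above.
V_IN, V_OUT, I_OUT, EFFICIENCY = 48.0, 24.0, 20.0, 0.96

p_out = V_OUT * I_OUT              # 480 W delivered to the load
p_in = p_out / EFFICIENCY          # 500 W drawn from the source
p_loss = p_in - p_out              # heat the converter itself must shed
i_in = p_in / V_IN                 # average input current

print(f"P_out = {p_out:.0f} W, P_in = {p_in:.0f} W")
print(f"Dissipation = {p_loss:.1f} W, input current = {i_in:.1f} A")
```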
The 20-A i7A models are available in three 34×36.8-mm mechanical configurations: low-profile open frame, integrated baseplate for conduction cooling, and integrated heatsink for convection or forced air cooling.
Samples and price quotes for the i7A series step-down converters can be requested on the product page linked below.
Discrete GPU elevates in-vehicle AI
The Arc A760A, a discrete graphics processing unit (dGPU) from Intel, delivers high-fidelity graphics and AI-driven cockpit capabilities to high-end vehicles. According to Intel, the dGPU supports smooth, immersive AAA gaming and responsive, context-aware AI assistants.
The Arc A760A marks Intel’s entry into automotive discrete GPUs, complementing its existing portfolio of AI-enhanced, software-defined vehicle (SDV) SoCs with integrated GPUs. Together, these devices form an open and flexible platform that scales across vehicle trim levels. Automakers can start with Intel SDV SoCs and later add the dGPU to handle larger compute workloads and expand AI capabilities.
Enhanced personalization is enabled by AI algorithms that learn driver preferences, adapting cockpit settings without the need for voice commands. Automotive OEMs can transform the vehicle into a mobile office and entertainment hub with immersive 4K displays, multiscreen setups, and advanced 3D interfaces.
Intel expects the Arc A760A dGPU to be commercially deployed in vehicles as soon as 2025. Read the fact sheet here.
Raspberry Pi SBC touts RISC-V cores
The Raspberry Pi Pico 2 single-board computer is powered by the RP2350 MCU, which offers a choice of two Arm cores or two RISC-V cores. This $5 computer board also boasts higher clock speeds, twice the memory, enhanced security, and upgraded interfacing compared to its predecessor, the Pico 1.
Designed by Raspberry Pi, the RP2350 MCU uses a dual-core, dual-architecture design, pairing two Arm Cortex-M33 cores with two Hazard3 RISC-V cores. Users can select between the core types via software or by programming the on-chip OTP memory. Both the Arm and RISC-V cores run at clock speeds of up to 150 MHz.
Pico 2 offers 520 kbytes of on-chip SRAM and 4 Mbytes of onboard flash. A second-generation programmable I/O (PIO) subsystem provides 12 PIO state machines for flexible, CPU-free interfacing.
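To give a feel for what those PIO state machines are good for, here is a minimal MicroPython sketch (assuming the Pico 2 MicroPython firmware and the onboard LED on GP25, both assumptions on my part) that offloads an LED blink loop entirely to one state machine, leaving both CPU cores idle. It uses the standard rp2 module API rather than anything specific to the second-generation PIO hardware.

```python
# Minimal PIO example: one state machine toggles the LED with zero CPU involvement.
# Assumes MicroPython for the Pico 2 and the onboard LED on GP25.
import rp2
from machine import Pin

@rp2.asm_pio(set_init=rp2.PIO.OUT_LOW)
def blink():
    # Each half-period: 1 + 7 + 32 * (30 + 1) = 1000 state-machine cycles
    set(pins, 1)
    set(x, 31)          [6]
    label("high")
    nop()               [29]
    jmp(x_dec, "high")
    set(pins, 0)
    set(x, 31)          [6]
    label("low")
    nop()               [29]
    jmp(x_dec, "low")

# Clock the state machine at 3 kHz: 1000 cycles per half-period gives a ~1.5 Hz blink
sm = rp2.StateMachine(0, blink, freq=3000, set_base=Pin(25))
sm.active(1)
```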
The security architecture of the Pico 2 is built around Arm TrustZone for Cortex-M and includes signed boot support, 8 kbytes of on-chip antifuse OTP memory, SHA-256 acceleration, and a true random number generator. Global bus filtering is based on Arm or RISC-V security/privilege levels.
Preorders of the Pico 2 are being accepted now through Raspberry Pi’s approved resellers. Even though Pico 2 does not offer Wi-Fi or Bluetooth connectivity, Raspberry Pi expects to ship a wireless-enabled version before the end of the year.
Microchip Technology Adds ECC20x and SHA10x Families of Secure Authentication ICs to TrustFLEX Platform
Pre-Configured CryptoAuthentication ICs help reduce development time and minimize design costs
Secure key provisioning is vital to protecting sensitive keys against third-party tampering and malicious attacks. Secure key storage is essential for consumer, industrial, data center and medical applications, but the process of developing and documenting secure key provisioning can be complex and costly. To lower the barrier to entry for secure key provisioning and enable more rapid prototyping, Microchip Technology has added the ECC204, SHA104 and SHA105 CryptoAuthentication ICs to its TrustFLEX portfolio of devices, services and tools.
ECC20x and SHA10x ICs are hardware-based secure storage devices designed to keep secret keys protected from unauthorized access and attacks. As part of the TrustFLEX platform, the ECC204, SHA104 and SHA105 ICs come preconfigured with defined use cases, customizable cryptographic keys and code examples to streamline the development process.
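To make the use model concrete, here is a rough Python sketch of the symmetric challenge-response pattern that SHA-based authentication ICs are typically deployed for. It illustrates the general flow only; the function names are invented for this example, and nothing here reflects the ECC204/SHA10x command set or the CryptoAuthLib API. In the real parts, the shared key lives inside the secure IC rather than in host software, which is the whole point.

```python
# Conceptual symmetric challenge-response flow, in plain Python for illustration.
# In an actual design the shared key never leaves the authentication IC; only
# challenges and digests cross the bus.
import hashlib
import hmac
import secrets

SHARED_KEY = secrets.token_bytes(32)   # provisioned into the accessory at manufacture

def accessory_respond(challenge: bytes) -> bytes:
    """Role played by the accessory-side IC: digest the challenge with the hidden key."""
    return hmac.new(SHARED_KEY, challenge, hashlib.sha256).digest()

def host_verify(challenge: bytes, response: bytes) -> bool:
    """Role played by the host (optionally backed by its own companion secure IC)."""
    expected = hmac.new(SHARED_KEY, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)

challenge = secrets.token_bytes(32)    # a fresh random challenge defeats replayed responses
assert host_verify(challenge, accessory_respond(challenge))
print("accessory authenticated")
```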
“Adding the ECC20x and SHA10x pre-configured devices to our TrustFLEX platform will facilitate leveraging Microchip’s secure provisioning services for a broader set of applications,” said Nuri Dagdeviren, corporate vice president of Microchip’s secure computing group. “With this platform expansion, Microchip is continuing to strengthen its portfolio, making security authentication ICs more accessible and more specifically optimized for high-volume, cost-sensitive applications.”
ECC20x and SHA10x devices meet Common Criteria Joint Interpretation Library (JIL) High rated secure key storage requirements and have been certified by the NIST Entropy Source Validation (ESV) and Cryptographic Algorithm Validation Program (CAVP) in compliance with the Federal Information Processing Standard (FIPS). The secure IC families are designed to implement trusted authentication to maintain the confidentiality, integrity and authenticity of data and communications in a wide range of systems and applications.
Microchip’s CryptoAuthentication ICs are small, low-power devices designed to be compatible with any microprocessor (MPU) or microcontroller (MCU). They provide flexible solutions for securing industrial systems, medical devices, battery-powered equipment and disposable applications. Additionally, the ECC204 is a Wireless Power Consortium (WPC)-approved Qi authentication Secure Storage Subsystem (SSS). Visit the Microchip website to learn more about the Trust Platform and its portfolio of security solutions.
Development Tools
ECC20x and SHA10x ICs are supported by Microchip’s Trust Platform Design Suite, which provides code examples and learning materials and enables the secure transfer of credentials so designers can more easily leverage Microchip’s secure key provisioning services. The devices are also supported by the MPLAB® X Integrated Development Environment (IDE), product-specific evaluation boards and the CryptoAuthLib library.
New Vishay Intertechnology Silicon PIN Photodiode Improves Sensitivity in Biomedical Applications
Key Features Include Larger Sensitive Area of 6.0 mm², Increased Reverse Light Current, and Small Form Factor of 4.8 mm by 2.5 mm by 0.5 mm
Vishay Intertechnology, Inc. has released a new silicon PIN photodiode that brings a higher level of sensitivity in the visible and near-infrared wavelength range to biomedical applications such as heart rate and blood oxygen monitoring. The new VEMD8082 features increased reverse light current, decreased diode capacitance, and faster rise and fall times compared to previous-generation solutions. Additionally, its small form factor of 4.8 mm by 2.5 mm by 0.5 mm makes it suitable for integration into low-profile devices such as smartwatches.
The high sensitivity provided by the VEMD8082 is particularly important in biomedical applications such as photoplethysmography (PPG), where the photodiode is used to detect changes in blood volume and flow by measuring the amount of light absorbed or reflected by blood vessels. In such applications, precise measurements are crucial for diagnosing and monitoring conditions such as cardiovascular disease.
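As a rough illustration of the PPG principle described above, the Python sketch below synthesizes a pulsatile photodiode signal, removes the slowly varying baseline, and counts beats to estimate heart rate; the sample rate, waveform, and processing are simplified assumptions for illustration, not anything drawn from the VEMD8082 documentation.

```python
# Toy PPG chain: synthesize a pulsatile waveform, strip the DC baseline, and
# count beats. All parameters are illustrative; real PPG front ends add optical
# filtering, ambient-light rejection, and motion-artifact handling.
import math

FS = 100                  # assumed sample rate, Hz
HR_TRUE = 72              # beats per minute used to build the test signal
N = 10 * FS               # 10 seconds of samples

# Photodiode signal: large DC level (tissue, ambient light) plus a small AC pulse term
signal = [1.0 + 0.02 * math.sin(2 * math.pi * (HR_TRUE / 60) * n / FS) for n in range(N)]

# Estimate the baseline with a ~1 s moving average, then subtract it
win = FS
baseline = [sum(signal[max(0, n - win):n + 1]) / (n + 1 - max(0, n - win)) for n in range(N)]
ac = [s - b for s, b in zip(signal, baseline)]

# Count upward zero crossings of the AC component as heartbeats
beats = sum(1 for n in range(1, N) if ac[n - 1] <= 0 < ac[n])
print(f"estimated heart rate: {beats * 60 / (N / FS):.0f} bpm")
```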
Compared with previous-generation devices, the specifications contributing to the VEMD8082’s higher sensitivity include a radiant-sensitive area of 6.0 mm² and an 18% to 20% increase in reverse light current, depending on wavelength. Diode capacitance drops from 50 pF to 46 pF, and the rise time improves from 110 ns to 40 ns, allowing higher sampling rates.
Samples and production quantities of the VEMD8082 are available now.
RISC-V migration to mainstream one startup at a time
As noted by Kleiner Perkins partner Mamoon Hamid, the migration to RISC-V is in full flight. Kleiner Perkins, along with Fidelity and Mayfield, is a backer of RISC-V upstart Akeana, which has officially launched after exiting stealth mode.
Akeana marked this occasion by unveiling RISC-V IPs for microcontrollers, Android clusters, artificial intelligence (AI) vector cores and subsystems, and compute clusters for networking and data centers. Its 100 Series configurable processors come with 32-bit RISC-V cores and support applications spanning from MCUs to edge gateways.
Akeana’s 1000 Series processor line includes 64-bit RISC-V cores and an MMU to support rich operating systems as well as in-order or out-of-order pipelines, multi-threading, vector extension, hypervisor extension and other extensions that are part of recent and upcoming RISC-V profiles.
Next, its 5000 Series features 64-bit RISC-V cores optimized for demanding applications in data centers and cloud infrastructure. These processors are compatible with the Akeana 1000 Series but offer much higher single-thread performance.
Three RISC-V processors come alongside an SoC IP suite. Source: Akeana
Akeana is especially confident about data center processors, having assembled the same team that designed Marvell’s ThunderX2 server chips. “Our team has a proven track record of designing world-class server chips, and we are now applying that expertise to the broader semiconductor market as we formally go to market,” said Rabin Sugumar, Akeana CEO.
Besides RISC-V processors, Akeana offers a collection of IP blocks needed to create processor system-on-chips (SoCs). That includes coherent cluster cache, I/O MMU, and interrupt controller IPs. The company also provides scalable mesh and coherence hub IP compatible with AMBA CHI to build large coherent compute subsystems for data centers and other use cases.
Akeana, another RISC-V startup challenging the semiconductor industry status quo, has officially launched three years after its founding, and it has raised over $100 million from A-list investors like Kleiner Perkins, Mayfield, and Fidelity.
Related Content
- Navigating the RISC-V Revolution in Europe
- Amidst export restrictions, RISC-V continues to advance
- Accelerating RISC-V development with network-on-chip IP
- RISC-V venture in Germany to accelerate design ecosystem
- RISC-V as you like it: the ‘first’ fully configurable 64-bit processor core
Google’s fall…err…summer launch: One-upping Apple with a sizeable product tranche
Within last month’s hands-on (or is that on-wrist?) coverage of Google’s first-generation Pixel Watch, I alluded to:
…rumors of the third-generation offering already beginning to circulate…
Later in that writeup, I further elaborated:
…the upcoming Pixel Watch 3, per leaked images, will not only be thicker but also come in a larger-face variant.
I (fed by the rumor mill, in my defense) was mostly right, as it turns out. The “upcoming” Pixel Watch 3 was released today (as I write these words on Tuesday, August 13). It does come in both legacy 41 mm and new, larger 45 mm flavors. And, in a twist I hadn’t foreseen, the bezel real estate has decreased by 16%, freeing up even more usable space at both screen sizes (the displays are also claimed to be twice as bright as before). But they’re not thicker than before; they’ve got the same 12.3 mm depth as the second-gen precursor. And anyway, I’m getting ahead of myself, as today’s live event (including live demos, full of jabs at competitor Apple’s seemingly scripted, pre-recorded preferences in recent years):
was preceded by several notable preparatory, press release-only announcements last week.
4th-generation Nest learning thermostat
I’ve got a Lennox “smart” system, so don’t have personal experience with Nest (now Google Nest) gear, but I know a lot of folks who swear by it, so for them the latest-generation addition will likely be exciting. Aside from various cosmetic and other aesthetic tweaks, it’s AI-centric, which won’t surprise anyone who saw the Google I/O keynote (or my coverage of it):
With the Nest Learning Thermostat we introduced the concept of intelligence to help you save energy and money. It keeps you comfortable while you’re home and switches to an energy-efficient temperature when you’re away. And the Nest Learning Thermostat (4th gen) is our smartest, most advanced thermostat yet.
It uses AI to automatically make micro-adjustments based on your patterns to keep you comfortable while saving both energy and money. Now, AI can more quickly and accurately create your personalized, energy-saving temperature schedules. With Smart Schedule, the thermostat learns which temperatures you choose most often or changes in behavior based on motion detected in your home — like coming home earlier — and automatically adjusts your temperature schedule to match. These energy-saving suggestions can be implemented automatically, or you can accept or reject them in the Google Home app so you’re always in control.
The thermostat also analyzes how the weather outdoors will affect the temperature inside. For example, if it’s a sunny winter day and your home gets warmer on its own, it will pause heating. Or, on a humid day, the indoor temperature may feel warmer than intended, so the thermostat will adjust accordingly.
TV Streamer
This one was admittedly something of a surprise, albeit less so in retrospect. Google is end-of-lifing its entire Chromecast product line, replacing it with the high-end TV Streamer, which not only handles traditional audio and video reception-and-output tasks but also, for example, does double duty as a “smart home” hub for Google Home and Matter devices. The reason it wasn’t a complete surprise is that, as I’d mentioned before, the existing Chromecast with Google TV hardware was getting a bit long in the tooth, understandable given that the original 4K variant dated from September 2020, with the lower-priced FHD version only two years newer.
With its Chromecast line, Google has always strived to deliver not only an easy means of streaming Internet-sourced content to (and displaying it on) a traditional “dumb” TV but also a way to upgrade the conventional buggy, sloth-like “smart” TV experience. As “smart” TVs have beefed up their hardware and associated software over the past four years, however, the gap between them and the Chromecast with Google TV narrowed and, in some cases, collapsed and even flipped. That said, I still wonder why the company decided to make a clean break from the longstanding Chromecast brand equity investment versus, say, calling it the “Chromecast 2”.
The other factor, I’m guessing, has at least something to do with comments I made in my recent teardown of a Walmart Android TV-based onn. UHD streaming device:
Walmart? Why?… I suspect it has at least something to do with the Android TV-then-Google TV commodity software foundation on which Google’s own Chromecast with Google TV series along with the TiVo box I tore down for March 2024 publication (for example) are also based, which also allows for generic hardware. Combine that with a widespread distribution network:
Today, Walmart operates more than 10,500 stores and clubs in 19 countries and eCommerce websites.
And a compelling (translation: impulse purchase candidate) price point ($30 at intro, vs $20 more for the comparable-resolution 4K variant of Google’s own Chromecast with Google TV). And you’ve got, I suspect Walmart executives were thinking, a winner on your hands.
Competing against a foundation-software partner who’s focused on volume at the expense of per-unit profit (even willing to sell “loss leaders” in some cases, to get customers in stores and on the website in the hopes that they’ll also buy other, more lucrative items while they’re there) is a tough business for Google to be in, I suspect. Therefore, the pivot to the high end, letting its partners handle the volume market while being content with the high-profit segment. This is a “pivot” that you’ll also see evidence of in products the company announced this week. To wit…
The Pixel 9 smartphone series
Now’s as good a time as any to discuss the “elephant in the room”. Doesn’t Apple generally (but not always) release new iPhones (and other things) every September? And doesn’t Google generally counter with its own smartphone announcements (not counting “a” variants) roughly one month later? Indeed. But this time, Mountain View-headquartered Google apparently decided to get the jump on its Cupertino-based Silicon Valley competitor (who I anticipate will once again unveil new iPhones next month; as always, stay tuned for my coverage!).
Hence the “One-upping Apple” phrase in this post’s title, and the already-mentioned repeated snark from Google regarding live-vs-pre-recorded events (and demos at such), along with plenty of other examples. That said, Google wasn’t above mimicking its competitor, either. Check out the first-time (versus historically curved) flat edges in the above video. Where have you seen them (plenty of times) before? Hmmm? Meanwhile, the Pixels’ back-panel camera bar (which I’ll cover more in an upcoming post) currently remains a Google-only characteristic.
Another Apple-reminiscent form factor evolutionary adaptation also bears mentioning. Through the Pixel 4 family generation, Google sold both standard and large screen “XL” smartphone variants. The “XL” option disappeared with the Pixel 5. In its stead, starting with the Pixel 6, a “Pro” version arrived…a larger screen size than the standard, as in the “XL” past, but also accompanied by a more elaborate multi-camera arrangement, among other enhancements.
And now with the Pixel 9 generation, there are two “Pro” versions: a standard size (6.3” diagonal, the same as the conventional Pixel 9, which has grown a bit from its 6.2” Pixel 8 predecessor) and a resurrected large-screen “XL” (6.8” diagonal). Remember my earlier comments about media streamers: how Google was seemingly doing a “pivot to the high end, letting its partners handle the volume market while being content with the high-profit segment”? Sound familiar? It’s also reminiscent of how Apple dropped its small-screen “mini” variant after just two generations (the iPhone 12 mini and 13 mini).
Even putting aside standard-vs-“Pro” and standard-vs-large screen product line proliferation prioritization by Google, the overall product line pricing has also increased. The Pixel 7 phones that I’m currently using, for example, started at $599, with the year-later (and year-ago) Pixel 8 starting at $699; the newly unveiled Pixel 9 successor begins at $799. That said, in fairness, you now get 50% more RAM (12 vs 8 GBytes). Further to that point, especially given that the associated software suite is increasingly AI-enhanced (yes, Google Assistant is being replaced by Gemini, including the voice-based and Alexa- and Siri-reminiscent Gemini Live), Google isn’t making the same mistake it initially did with the Pixel 8 line.
At intro, only the 12 GByte RAM-inclusive “Pro” version of the Pixel 8 was claimed capable of supporting Google’s various Gemini deep learning models; the company later rolled out “Nano” Gemini variants that could shoehorn into the Pixel 8’s 8 GBytes. This time, both the Pixel 9 (again, 12 GBytes) and the Pixel 9 Pro/Pro XL (16 GBytes) are good to go. And I suspect Apple will be similarly memory-inclusive from an AI (branded Apple Intelligence, in this case) standpoint with its upcoming iPhone 16 product line, given that its current-generation AI support is comparably restricted, to only the highest-end iPhone 15 Pro and Pro Max.
Accompanying the new-generation phones is, unsurprisingly, a new-generation SoC powering them: the Tensor G4. As usual, beyond Google’s nebulous claim that “It’s our most efficient chip yet,” we’ll need to wait for Geekbench leaks, followed by actual hands-on testing results, to assess performance, power consumption and other metrics, both in an absolute sense and relative to precursor generations and competitors’ alternatives. They all (like Apple for several years now) come with two years of gratis satellite SOS service, which is nice. And they all also, after three generations’ worth of oft-frustrating conventional under-display fingerprint sensor usage (oh, how I miss the back panel fingerprint sensors of the Pixel 5 and precursors!), switch to a hopefully more reliable ultrasonic sensor approach (already successfully in use in Samsung Galaxy devices, which is an encouraging sign).
That said, the displays themselves differ between standard and “Pro” models: the Pixel 9 has a 6.3-inch OLED with 2,424×1,080-pixel resolution (422 pixels per inch, i.e., ppi, density) and a 60-120 Hz variable refresh rate, while its same-size Pixel 9 Pro sibling totes a 2,856×1,280-pixel resolution (495 ppi density), and its low-temperature polycrystalline oxide (LTPO) OLED affords an even broader 1-120 Hz variable refresh rate range to optimize battery life. The Pixel 9 Pro XL’s display is also LTPO OLED in nature, this time with a 2,992×1,344-pixel resolution (486 ppi density).

And where the phones also differ, among other things (some already mentioned) and speaking of AI enhancements, is in their front and rear camera allotments and specifications. With the Pixel 9, you get two rear cameras (a 50 Mpixel main and a 48 Mpixel ultrawide), along with a 10.5 Mpixel front-facing camera. The Pixel 9 Pro and Pro XL add a third rear camera, a 48 Mpixel telephoto with 5x optical zoom, as well as bumping up the front camera to 42 Mpixel resolution. And for examples of some of the new and enhanced AI-enabled computational photography capabilities, check out this coverage, along with a first-look video from Becca, at The Verge:
Video Boost’s cloud-processed, smoother transitions between lenses (dealing with an issue whose root cause I also discuss in the aforementioned upcoming blog post) are very cool.
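As a quick sanity check, the pixel-density figures quoted above follow directly from resolution and diagonal size (assuming square pixels and the nominal diagonals); a few lines of Python reproduce them closely:

```python
# ppi = diagonal pixel count / diagonal size in inches (square pixels assumed)
import math

panels = {
    "Pixel 9 (6.3 in)":        (2424, 1080, 6.3),
    "Pixel 9 Pro (6.3 in)":    (2856, 1280, 6.3),
    "Pixel 9 Pro XL (6.8 in)": (2992, 1344, 6.8),
}

for name, (h_px, v_px, diag_in) in panels.items():
    print(f"{name}: {math.hypot(h_px, v_px) / diag_in:.0f} ppi")

# Prints roughly 421, 497, and 482 ppi, in line with the quoted 422, 495, and 486
# (the small deltas come from rounded diagonal figures).
```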
In closing (at least for this section), a few words on the Pixel 9 Pro Fold, the successor to last year’s initial Pixel Fold. Read through The Verge’s recently published long-term usage report on the first-generation device (or Engadget’s version of the same theme, for that matter), and you’ll see that one key hoped-for improvement in its successor was increased display brightness. Well, Google delivered here, claiming that it’s “80% brighter than Pixel Fold”.
The Pixel 9 Pro Fold also inherits other Pixel 9 series improvements, such as to the SoC and camera subsystem. After some initial glitches, Google seems to have solved the Pixel Fold’s screen reliability issues, a fix that I assume will carry forward to the second generation. And the company is also currently offering a generous first-generation trade-in deal, although you’ll still be shelling out $1,000+ for the second-gen upgrade. All that said, as I read through the coverage of both generations of foldable devices, I can’t help but wonder what could also have been, had Google and Microsoft worked together more effectively to harness the Surface Duo’s hardware potential with equally robust (and long-supported) software. Sigh.
Smart watches
I already teased the new Pixel Watch 3 variants at the beginning of this piece, specifically with respect to their dimensions and display characteristics. Interestingly, they run the same SoC found in their second-generation predecessors, Qualcomm’s Snapdragon SW5100, and have the same RAM allocation, 2 GBytes. The new Loss of Pulse Detection capability is compelling, specifically its multi-sensor implementation that strives to prevent “false positives”:
Loss of Pulse Detection combines signals from Pixel Watch 3’s sensors, AI and signal-processing algorithms to detect loss of pulse events, with a thoughtful design — built from user research — to limit false alarms. The feature uses signals from the Pixel Watch 3’s existing Heart Rate sensor, which uses green light to check for a user’s pulse.
If the feature detects signs of pulselessness, infrared and red lights also turn on, looking for additional signs of a pulse, while the motion sensor starts to look for movement. An AI-based algorithm brings together the pulse and movement signals to confirm a loss of pulse event, and if so, triggers a check-in to see if you respond.
The check-in asks if you’re OK while also looking for motion. If you don’t respond and no motion is detected, it escalates to an audio alarm and countdown. If you don’t respond to the countdown, the LTE watch or phone your watch is connected to automatically places a call to emergency services, and shares an automated message that no pulse is detected along with your location.
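Purely to visualize the staged escalation described in the passage above, here is a toy Python sketch of that decision flow; the sensor inputs, their boolean encodings, and the sequencing shown are placeholder assumptions on my part, not Google’s actual algorithms or timings.

```python
# Simplified model of the staged Loss of Pulse Detection flow described above.
# Sensor readings are reduced to booleans and all AI/signal-processing detail is
# omitted; this only captures the escalation sequence, not the real algorithm.
from dataclasses import dataclass

@dataclass
class Readings:
    green_pulse: bool    # routine heart-rate sensing with green light
    ir_red_pulse: bool   # confirmation pass using infrared and red light
    motion: bool         # accelerometer activity
    user_response: bool  # did the wearer answer the on-watch check-in?

def loss_of_pulse_flow(r: Readings) -> str:
    if r.green_pulse:
        return "normal monitoring"
    if r.ir_red_pulse or r.motion:                     # corroborate before alarming
        return "no event: pulse or movement found on recheck"
    if r.user_response:                                # interactive check-in
        return "check-in acknowledged, no escalation"
    # audible alarm and countdown elapse without a response
    return "place emergency call with automated no-pulse message and location"

print(loss_of_pulse_flow(Readings(False, False, False, False)))
```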
And unlike the Pixel 9 smartphones, which will still be “stuck” on Android 14 at initial shipment (Android 15 is still in beta, obviously), the new watches will ship with Wear OS 5, whose claimed power consumption improvements I’m anxious to see in action on my first-gen Pixel Watch, too. Speaking of which, that “free two years of Google Fi-served wireless service for LTE watches” short-term promotion that I’d told you I snagged? It’s now broadly available.
I should note that Google launched a new smartwatch last week, too: the kid-tailored Fitbit Ace LTE. But what about broader-audience new Fitbit smartwatches? Apparently, the Pixel Watch family is the solitary path going forward here; in the future, Google will refocus the Fitbit brand specifically on lower-end, activity-tracker-only devices.
Earbuds
Last (but not least), what’s up with Google and earbuds? At the beginning of last year, within a teardown of the first-generation Pixel Buds Pro, I also gave a short historical summary of Google’s to-date activity in this particular product space. Well, two years after the initial Pixel Buds Pro series rolled out, the second-generation successors are here. They check all the predictable improvement-claim boxes:
- “Twice as much noise cancellation”
- “24% lighter and 27% smaller”
- “increase[d]…battery life, despite the smaller and lighter design. When Active Noise Cancellation is enabled, you get up to 8 hours”
And, for the first time, they’re powered by Google-designed silicon, the Tensor A1 SoC (once again reminiscent of Apple’s in-house supply chain strategy). That said, I was happily surprised to see that the “wings” designed to help keep the earbuds in place when in use have returned, albeit thankfully in a more subdued form than in the standard Pixel Buds implementation:
Although I’m blessed to own several sets of earbuds from multiple suppliers, I inevitably find myself repeatedly grabbing my first-gen Pixel Buds Pros in all but critical-music-listening situations. However, even when all I’m doing is weekend vacuuming, they don’t stay firmly in place. The second-gen “wings” should help here. Hear? (abundant apologies for the bad pun).
Having just passed through 2,500 words, I’m going to wrap up at this point and pass the cyber-pen over to you all for your thoughts in the comments!
—Brian Dipert is the Editor-in-Chief of the Edge AI and Vision Alliance, and a Senior Analyst at BDTI and Editor-in-Chief of InsideDSP, the company’s online newsletter.
Related Content
- Playin’ with Google’s Pixel 7
- Pixel smartphone progressions: Latest offerings and personal transitions
- The 2024 Google I/O: It’s (pretty much) all about AI progress, if you didn’t already guess
- The Google Chromecast with Google TV: Realizing a resurrection opportunity
- Google’s Chromecast with Google TV: Car accessory similarity, and a post-teardown resurrection opportunity?
- The Pixel Watch: An Apple alternative with Google’s (and Fitbit’s) personal touch
- Walmart’s onn. UHD streaming device: Android TV at a compelling price