News from the world of micro- and nanoelectronics
CNC Soldering Definition, Process, Working, Uses & Advantages
CNC (Computer Numerical Control) soldering is an automated process in which soldering tasks are performed by a programmable CNC machine. It allows precise control over the soldering process, making it ideal for repetitive and intricate soldering tasks in electronics manufacturing. The CNC system guides the soldering tool along pre-defined paths to achieve accurate solder joints.
How CNC Soldering Works:
- Design Input: The process starts with CAD/CAM software to create a digital design of the soldering task.
- Programming: The design is converted into machine-readable G-code, which guides the CNC soldering machine.
- Machine Setup: The soldering tool (e.g., soldering iron, laser, or ultrasonic tool) is mounted on the CNC arm.
- Execution: The machine follows the programmed path, precisely applying solder to designated areas, ensuring consistent quality.
- Inspection: Automated or manual inspection ensures solder joints meet required standards.
The CNC Soldering Process:
- Preparation:
  - Load the components and PCB (Printed Circuit Board) onto the machine.
  - Input the soldering design and settings.
- Heating and Solder Application:
  - The CNC tool applies heat to the solder and component leads.
  - Solder flows to form a secure joint.
- Cooling:
  - The joint is allowed to cool naturally or with cooling systems to solidify.
- Quality Check:
  - Joints are inspected for accuracy and integrity.
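The programmed-path idea behind these steps can be sketched in miniature. This is a hedged illustration only: it emits G-code for a soldering pass over a list of pad coordinates, and the coordinates, retract height, feed rate, and dwell times are invented for the example rather than taken from any particular machine's dialect.

```python
# Hedged sketch: generating G-code for a CNC soldering pass.
# All numeric values (heights, feed, dwell) are illustrative assumptions.

def solder_gcode(pads, travel_z=5.0, solder_z=0.5, feed=600, dwell_s=1.5):
    """Emit G-code that visits each pad, lowers the tool, and dwells."""
    lines = ["G21 ; millimetres", "G90 ; absolute positioning"]
    for x, y in pads:
        lines.append(f"G0 Z{travel_z:.1f}")          # retract to travel height
        lines.append(f"G0 X{x:.2f} Y{y:.2f}")        # rapid move over the pad
        lines.append(f"G1 Z{solder_z:.1f} F{feed}")  # lower tool onto the joint
        lines.append(f"G4 P{int(dwell_s * 1000)}")   # dwell while solder flows
    lines.append(f"G0 Z{travel_z:.1f} ; final retract")
    return "\n".join(lines)

program = solder_gcode([(10.0, 12.5), (10.0, 15.0)])
print(program)
```

A real machine's post-processor would also handle tool temperature, solder-wire feed, and tip cleaning; this sketch covers only the motion-path portion described above.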
Uses of CNC Soldering:
- Electronics Manufacturing: Ideal for PCBs in consumer electronics, automotive electronics, and medical devices.
- Prototype Development: Rapid soldering of prototype boards with consistent quality.
- Aerospace and Defense: Precise soldering for high-reliability applications.
- LED Assembly: Used for accurate placement and soldering of LED components.
- Telecommunications: Efficient soldering of intricate circuit boards for communication devices.
Advantages of CNC Soldering:
- Precision: Ensures accurate soldering with minimal errors.
- Consistency: Delivers uniform quality across all joints.
- Speed: Automates repetitive tasks, reducing production time.
- Scalability: Suitable for both small-scale and mass production.
- Safety: Minimizes manual handling, reducing risks to operators.
- Versatility: Compatible with various soldering tools and techniques, including laser and ultrasonic soldering.
Disadvantages of CNC Soldering:
- High Initial Cost: Significant investment in CNC machines and setup.
- Complex Setup: Requires skilled personnel for programming and maintenance.
- Limited Flexibility: Less adaptable to on-the-fly changes compared to manual soldering.
- Material Compatibility: May not suit all types of soldering materials or components.
- Maintenance Requirements: Machines need regular calibration and upkeep.
CNC soldering is a cornerstone of modern electronics manufacturing, combining efficiency and precision while offering cost-effective solutions for high-quality soldering needs.
The post CNC Soldering Definition, Process, Working, Uses & Advantages appeared first on ELE Times.
Wi-Fi Light Switch Meaning, Types, Working, Benefits & Applications
A Wi-Fi light switch is a smart device that allows you to control your lights remotely using a smartphone, voice assistant, or automation system. It connects to your home Wi-Fi network, enabling you to manage lighting from anywhere via an app or compatible smart home ecosystem.
Types of Wi-Fi Light Switches:
- Standard Wi-Fi Switch: Replaces traditional light switches, offering basic on/off functionality remotely.
- Dimmer Wi-Fi Switch: Allows you to adjust light brightness levels.
- Multi-Way Wi-Fi Switch: Designed for controlling lights from multiple locations, such as at the top and bottom of stairs.
- Touch Wi-Fi Switch: Features a sleek touch-sensitive panel for manual control.
- Voice-Activated Switch: Compatible with voice assistants like Alexa, Google Assistant, or Siri.
- Switch with Scheduling: Includes timers and scheduling features for automated lighting.
How a Wi-Fi Light Switch Works:
- Connectivity: Connects to your home Wi-Fi network.
- Integration: Controlled through a smartphone app or smart home system.
- Signal Transmission: Commands are sent via Wi-Fi to the switch, triggering the connected lights.
- Additional Features: May include energy monitoring, scenes, and integration with other smart devices.
Applications:
- Home Automation: Seamlessly integrate with other smart devices for automated lighting.
- Energy Efficiency: Optimize usage by controlling lights remotely.
- Enhanced Security: Schedule lights to mimic occupancy when you’re away.
- Convenience: Adjust lighting without needing to be in the same room.
- Customizable Ambiance: Use dimmer switches to set the perfect mood.
How to Set Up and Use:
- Install the Switch: Replace your standard switch following the manufacturer's instructions (may require a neutral wire).
- Connect to Wi-Fi: Use the app to link the switch to your home network.
- Configure Settings: Customize lighting schedules, dimming levels, and smart home integration.
- Control via App or Voice: Use the app or a voice assistant for on-the-go operation.
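The app-to-switch control path in the steps above can be sketched in code. This is a hedged example: the `/relay` endpoint and the JSON schema are hypothetical, since every vendor uses its own local or cloud protocol, and the sketch only builds the command rather than sending it.

```python
# Hedged sketch: the kind of command a phone app might send to a Wi-Fi
# switch over the local network. Endpoint path and JSON keys are
# hypothetical -- real switches each define their own API.

import json

def build_command(switch_ip, on, brightness=None):
    """Return (url, payload) for a hypothetical local REST endpoint."""
    url = f"http://{switch_ip}/relay"   # hypothetical endpoint
    payload = {"state": "on" if on else "off"}
    if brightness is not None:          # only dimmer models accept this
        payload["brightness"] = max(0, min(100, brightness))
    return url, json.dumps(payload)

url, body = build_command("192.168.1.40", on=True, brightness=120)
print(url, body)  # out-of-range brightness is clamped to 0-100
```

In a real app this payload would be POSTed to the switch (or published over MQTT), and scheduling would simply issue the same command from a timer.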
Benefits of Wi-Fi Light Switches:
- Remote Control: Manage your lighting system from anywhere using a smartphone.
- Energy Savings: Optimize usage and reduce unnecessary energy consumption.
- Convenience: No need to physically interact with switches.
- Customization: Create schedules, scenes, and automations tailored to your lifestyle.
- Scalability: Expand your smart home system with additional switches and devices.
- Enhanced Aesthetics: Sleek designs complement modern interiors.
Wi-Fi light switches are a perfect blend of functionality, efficiency, and style, making them a great addition to any smart home.
The post Wi-Fi Light Switch Meaning, Types, Working, Benefits & Applications appeared first on ELE Times.
India Budget 2025: Tech Sector Eyes Strategic Investments and Policy Reforms for Growth and Innovation
The upcoming Union Budget 2025 is generating significant anticipation within India’s tech ecosystem, as stakeholders call for increased investments and policy reforms to drive innovation, self-reliance, and global competitiveness.
Key Expectations:
- Semiconductor Ecosystem Boost: Industry leaders expect the government to announce additional incentives under the Production Linked Incentive (PLI) scheme to accelerate India’s semiconductor manufacturing ambitions, especially in light of ongoing global chip shortages.
- AI and Emerging Technologies: Increased funding for AI, blockchain, quantum computing, and IoT is sought, alongside initiatives to establish India as a global hub for AI research and innovation. Stakeholders also anticipate a clear roadmap for ethical AI governance.
- Data Localization and Cybersecurity: Strengthened support for data infrastructure, cybersecurity frameworks, and compliance with the Digital Personal Data Protection (DPDP) Act, ensuring both security and ease of doing business.
- Startup Ecosystem Support: Startups are hoping for relaxed compliance norms, enhanced funding avenues, and tax reliefs to stimulate innovation and attract global venture capital.
- Digital Public Infrastructure: Expectations include investments in 5G rollout, public digital platforms, and rural broadband expansion to bridge the digital divide and foster inclusivity.
- Green Tech Focus: Tech players anticipate policies promoting green energy solutions, energy-efficient data centers, and EV technology development.
The Budget 2025 is expected to solidify India’s position as a global tech leader while addressing long-term growth goals and technological self-reliance.
The post India Budget 2025: Tech Sector Eyes Strategic Investments and Policy Reforms for Growth and Innovation appeared first on ELE Times.
Government May Extend Feedback Deadline for DPDP Act Draft Rules to Ensure Comprehensive Input
The Indian government is considering extending the deadline for public feedback on the draft rules of the Digital Personal Data Protection (DPDP) Act by an additional two weeks beyond the initial February 18, 2025, cutoff. This potential extension aims to provide stakeholders with more time to thoroughly review and comment on the proposed regulations.
The Ministry of Electronics and Information Technology (MeitY) released the draft rules earlier this month, initiating a 45-day period for public consultation. In a recent industry meeting held in New Delhi, Union Electronics and Information Technology Minister Ashwini Vaishnaw emphasized the government’s commitment to conducting comprehensive consultations to ensure all stakeholder perspectives are considered. The objective is to balance innovation with regulation, fostering an environment conducive to technological growth while safeguarding personal data.
Attendees of the consultation included representatives from major technology companies such as Snap, Google, Meta, and OpenAI, as well as industry bodies like Nasscom, Broadband India Forum, and the Data Security Council of India (DSCI). Key concerns raised during the discussions encompassed the potential compliance burdens associated with issuing new notices to existing users and the implications of proposed data localization requirements, which some fear may conflict with international regulations.
The DPDP Act, enacted in August 2023, seeks to establish a comprehensive framework for the processing of digital personal data in India, balancing individual privacy rights with the necessity for lawful data processing. The draft rules detail provisions related to data fiduciary responsibilities, consent management, data breach notifications, and the processing of children’s personal data, among other aspects.
Stakeholders are encouraged to utilize the extended consultation period to provide detailed feedback, ensuring the final regulations effectively address the diverse interests and concerns within India’s digital ecosystem.
The post Government May Extend Feedback Deadline for DPDP Act Draft Rules to Ensure Comprehensive Input appeared first on ELE Times.
Microsoft Commits $3 Billion to Expand AI and Cloud Infrastructure in India, Aiming to Train 10 Million by 2030
Microsoft has announced a $3 billion investment to expand its Azure cloud and artificial intelligence (AI) capacities in India over the next two years. This move emphasizes India’s importance as a key growth market for technology, given its expertise and cost-effectiveness. Additionally, the investment will focus on upskilling Indian professionals in AI, building on Microsoft’s plan to invest $80 billion in AI-enabled data centers. Microsoft CEO Satya Nadella highlighted India’s significant contributions to AI projects, particularly through GitHub Copilot, and noted that India is projected to have the largest developer community by 2028. Microsoft plans to train 10 million people in AI by 2030, following the upskilling of 2.4 million individuals last year. The investment reflects the ongoing competition among U.S. tech giants to capture and nurture technological talent in India.
This strategic expansion aligns with Microsoft’s broader vision to support India’s growing digital economy and underscores the nation’s pivotal role in the global technology landscape. By enhancing its infrastructure and focusing on skill development, Microsoft aims to empower individuals and organizations across India, fostering innovation and contributing to the country’s long-term competitiveness in the AI domain.
The post Microsoft Commits $3 Billion to Expand AI and Cloud Infrastructure in India, Aiming to Train 10 Million by 2030 appeared first on ELE Times.
India to Host Over 620 Global Capability Centres by 2030, Driving $105 Billion Market Growth
India is poised to host over 620 Global Capability Centres (GCCs) of Forbes Global 2000 companies by 2030, marking a nearly 40% increase from the current 450 companies operating 825 such centres, according to a study by consulting firm ANSR.
This expansion is expected to boost the talent base at these GCCs by 45%, reaching 1.9 million professionals. Notably, 45% of existing GCCs have expanded their operations across multiple Indian cities. A significant majority are focusing on advanced technologies: 85% on artificial intelligence and data analytics, 80% on cloud computing, 75% on robotic process automation, 70% on digital commerce and cybersecurity, and 45% on emerging technologies like blockchain, augmented and virtual reality, and the Internet of Things.
This growth underscores India’s evolution from a low-cost outsourcing hub to a critical operational center for global companies, driven by its substantial talent pool and mature offshoring ecosystem. The GCC market in India is projected to reach $105 billion by 2030, up from $64.6 billion in fiscal 2024.
The post India to Host Over 620 Global Capability Centres by 2030, Driving $105 Billion Market Growth appeared first on ELE Times.
So it begins...
Creating an instrument cluster for a custom Mini with a clever take on the "Printed Circuit Board" (Project Binky)
MACOM unveils five-year, $345m plan to expand 100mm GaN and GaAs production and introduce 150mm GaN
Veeco tightens revenue and EPS guidance ranges for Q4 and full-year 2024
Part 1: A beginner’s guide to the power of IQ data and beauty of negative frequencies

Editor’s Note: This is a two-part series where DI authors Damian and Phoenix Bonicatto explore IQ signal representation and negative frequencies to ease the understanding and development of SDRs.
Part 1 explains the commonly used SDR IQ signal representation and negative frequencies without the complexity of math.
Part 2 (to be published) presents a device that allows you to play with and display live SDR signal spectrums with negative frequencies.
Introduction
Software-defined radio (SDR) firmware makes extensive use of the I/Q representation of received and transmitted signals. This representation can simplify the manipulation of the incoming signal. I/Q data also allows us to work with negative frequencies. My goal here is to explain the I/Q representation and negative frequencies without the complexity usually invoked by obscure terms and non-intuitive mathematics. Also, I will present a device that you can build to allow you to play with and display live spectrums with negative frequencies. So, let’s get started.
I/Q and quadrature concepts
What is I/Q data? “I” is short for in-phase and “Q” is short for quadrature. It’s the first set of SDR terms that sounds mysterious and tends to put people off—let’s just call them I and Q. Simply, if you have a waveform, like you see on an oscilloscope, you can break it into two sinusoidal components—one based on a sine, and another based on a cosine. This is done by using the trig “angle sum identity”. The I and Q are the amplitudes of these components, so our signal is now represented as:

A*cos(ωt + φ) = I*cos(ωt) − Q*sin(ωt)
Where “A” is the original signal amplitude, φ is its phase, and:

I = A*cos(φ)
Q = A*sin(φ)
We have just created the in-phase signal, I*cos(ωt), and the quadrature signal, Q*sin(ωt). Just to add to the confusion, when we deal with the in-phase and quadrature signals together it is referred to as “quadrature signaling” …sigh.
[Note: In SDR projects IQ data (or I/Q data) is generally referring to the digital data pairs at each sample interval.]
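The decomposition described above can be checked numerically. This is a minimal sketch with arbitrary illustrative values for the amplitude, phase, and frequency; it verifies that the angle-sum identity reproduces the original waveform exactly.

```python
# Numeric check of the I/Q decomposition:
# A*cos(wt + phi) == I*cos(wt) - Q*sin(wt), with I = A*cos(phi),
# Q = A*sin(phi). All values below are arbitrary illustrative choices.

import math

A, phi = 2.0, 0.7          # amplitude and phase of the original signal
I = A * math.cos(phi)      # in-phase amplitude
Q = A * math.sin(phi)      # quadrature amplitude

w = 2 * math.pi * 50.0     # any angular frequency works
for t in [0.0, 0.001, 0.0037, 0.01]:
    original = A * math.cos(w * t + phi)
    from_iq = I * math.cos(w * t) - Q * math.sin(w * t)
    assert abs(original - from_iq) < 1e-12
print("I/Q decomposition matches at all sampled instants")
```

Note also that I² + Q² recovers A², which is why the I/Q pair carries both the amplitude and the phase of the signal at each sample interval.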
Most signal processing textbooks work with exponentials to describe and manipulate signals. For example, a transmitted signal is always “real” and is typically shown as something like:

s(t) = Re{A*e^(j(ωt + φ))}
This is another formula that creates obfuscation and puts off people just starting out in signal processing and SDR. I will say that exponential notation creates cleaner mathematical manipulation, but my preference is to use the trig representation as I can see the signal in my mind’s eye as I manipulate the equations. Also, explaining your design to people who are not up on signal processing is much easier when using things everyone learned in high school. Note that, although most SDR simulation tools like MATLAB use the exponential for signal processing work, when it comes down to writing C code in an MCU, the trig representation is normally used.
Without going into it, this exponential representation is based on Euler’s formula, which is related to the beautiful and cleverly derived Euler’s equation.
Now, you may wonder why we would go through the trouble to convert the data to this quadrature form and what this form of the signal is good for. In receivers, for example, just using the incoming signal and mixing it with another frequency and extracting the data has worked since the early days of radio. To answer this, let’s look at a couple of examples.
Example of the benefits of quadrature form
First, when doing simple mixing of an incoming signal you get, as an output, two signals—the sum of the incoming signal and the mix frequency, and the difference of these two frequencies. The following equation demonstrates this using the trig product identity:

cos(ω1t)*cos(ω2t) = ½*cos((ω1 + ω2)t) + ½*cos((ω1 − ω2)t)
To continue in your receiver, you typically need to filter one of these out, usually the higher frequency. (The unwanted resultant frequency is often called the image frequency, which is removed by an image filter.) In a digital receiver this filter can take some valuable resources (cycles and memory). Using the I/Q form above, a mix can be created that removes either just the sum or just the difference without filtering.
You can see how this works in Figure 1. First, define the mix signal in an I/Q format:
Mix Signal I part = cos(ωmt)
Mix Signal Q part = sin(ωmt)
Figure 1 Quadrature (complex-to-complex) mix returning the lower frequency.
(There is more to this, but this mix architecture is the basic idea of this technique.)
You can see that only the lower frequency is output from the mixer. If you want the higher frequency and to remove the lower frequency, just change where the minus sign is in the final additions as shown in Figure 2.
Figure 2 Quadrature mix returning the higher frequency.
This quadrature, or complex-to-complex, mixing is a very powerful technique in SDR designs.
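The complex-to-complex mix of Figure 1 can be verified numerically. In this sketch (frequencies and sample instants are arbitrary illustrative choices), an input tone represented as I/Q is mixed with an I/Q local oscillator, and the output is a pure difference-frequency tone with no sum-frequency image to filter off.

```python
# Hedged sketch of the Figure 1 quadrature mix: only the difference
# frequency (f_in - f_lo) appears at the output.

import math

f_in, f_lo = 300.0, 200.0   # input and mix (LO) frequencies, Hz

def quadrature_mix(t):
    i_in, q_in = math.cos(2*math.pi*f_in*t), math.sin(2*math.pi*f_in*t)
    i_lo, q_lo = math.cos(2*math.pi*f_lo*t), math.sin(2*math.pi*f_lo*t)
    # The sign choice in these final additions selects the difference
    # frequency; flipping it (as in Figure 2) selects the sum instead.
    out_i = i_in * i_lo + q_in * q_lo   # = cos((w_in - w_lo)t)
    out_q = q_in * i_lo - i_in * q_lo   # = sin((w_in - w_lo)t)
    return out_i, out_q

f_diff = f_in - f_lo  # 100 Hz
for t in [0.0, 0.0013, 0.0041, 0.09]:
    out_i, out_q = quadrature_mix(t)
    assert abs(out_i - math.cos(2*math.pi*f_diff*t)) < 1e-9
    assert abs(out_q - math.sin(2*math.pi*f_diff*t)) < 1e-9
print("Mixer output is a pure 100 Hz complex tone (difference only)")
```

The identities behind the comments are cos(a)cos(b) + sin(a)sin(b) = cos(a−b) and sin(a)cos(b) − cos(a)sin(b) = sin(a−b), which is all the mixer architecture is doing.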
Next, let’s look at how I/Q data can allow us to play with negative frequencies.
When you perform a classical (non-quadrature) mix, any result that you get cannot go below a frequency of zero. The result will be two new frequencies: the sum of the input frequencies and the absolute value of the difference. This absolute value means the output frequencies cannot go negative. In a quadrature mixer the frequency is not constrained with an absolute value function, and you can get negative frequencies.
Let’s think about what this means if you are sweeping one of the inputs. In the classical mixer as the two input frequencies approach each other, the difference frequency will approach 0 Hz and then start to go back up in frequency. In a quadrature mixer the difference frequency will go right through 0 Hz and continue getting more and more negative.
One implication of this is that, in a sampled system you’re working on, bandwidth is the sample rate divided by 2. When using a quadrature representation, you have a working bandwidth that is twice as large. This is especially handy when you have a system where you want to deal with a large range of frequencies at a time. You can move any of the frequencies to baseband; the higher frequencies will stay in their relative position in the positive frequencies; and the lower frequencies will stay in their relative positions in the negative frequencies. You can slide up and down, by mixing, without image filters or corrupting the spectrum with images. Another very powerful technique in SDR designs.
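The sign-of-frequency point above can be demonstrated directly on samples. In this sketch (sample rate and tone frequency are arbitrary illustrative choices), a real cosine at +f and at −f produces identical samples—the sign is lost—while the corresponding I/Q (complex) tones remain distinguishable, which is exactly why the quadrature representation doubles the usable bandwidth.

```python
# Why real samples fold negative frequencies while I/Q samples do not:
# cos is even, so a real tone at -f is indistinguishable from +f, but a
# complex tone exp(+/-j*2*pi*f*t) rotates in opposite directions.

import cmath, math

f, fs = 100.0, 1000.0
ts = [n / fs for n in range(8)]

real_pos = [math.cos(2*math.pi*f*t) for t in ts]
real_neg = [math.cos(2*math.pi*(-f)*t) for t in ts]
iq_pos = [cmath.exp(2j*math.pi*f*t) for t in ts]
iq_neg = [cmath.exp(2j*math.pi*(-f)*t) for t in ts]

same_real = all(abs(a - b) < 1e-12 for a, b in zip(real_pos, real_neg))
same_iq = all(abs(a - b) < 1e-12 for a, b in zip(iq_pos, iq_neg))
print(same_real, same_iq)  # real tones collapse; I/Q tones do not
```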
A tool for exploring IQ data
This positive and negative spectrum is very interesting, but unfortunately the basic FFT on your oscilloscope probably won’t display it—it typically only displays positive frequencies. Vector network analyzers (VNAs) can display negative frequencies, but not all labs have one. You can play around in tools like MATLAB, but I usually like something a little closer to actual hardware and more real-time to get a better feel for the concept. A signal generator and a scope always help me. But I already said a scope does not display negative frequencies. Well, the tool presented in Part 2 will allow us to play with I/Q data, negative frequencies, and mixing.
[Editor’s Note: An Arduino-Nano-based device will be presented in Part 2 that can generate IQ samples based upon user frequency, amplitude, and phase settings. This generated data will then display the spectrum showing both positive and negative frequencies. Stay tuned for more!]
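As a rough preview of what such a tool works with, here is a hedged sketch of an IQ-sample generator driven by user frequency, amplitude, and phase settings. The sample rate and settings below are illustrative assumptions, not the actual device's parameters; the point is that a negative frequency setting simply makes the I/Q phasor rotate the other way.

```python
# Hedged sketch: generate (I, Q) pairs for a complex tone from user
# frequency, amplitude, and phase settings. All parameter values are
# illustrative, not taken from the Part 2 device.

import math

def iq_samples(freq_hz, amplitude, phase_rad, sample_rate_hz, n):
    """Return n samples of a complex tone as (I, Q) tuples."""
    pairs = []
    for k in range(n):
        theta = 2 * math.pi * freq_hz * k / sample_rate_hz + phase_rad
        pairs.append((amplitude * math.cos(theta),   # I: in-phase
                      amplitude * math.sin(theta)))  # Q: quadrature
    return pairs

# A negative freq_hz reverses the rotation direction of the I/Q phasor,
# which is what places the tone on the negative side of the spectrum.
samples = iq_samples(freq_hz=-150.0, amplitude=1.0, phase_rad=0.0,
                     sample_rate_hz=1000.0, n=4)
```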
Damian Bonicatto is a consulting engineer with decades of experience in embedded hardware, firmware, and system design. He holds over 30 patents.
Phoenix Bonicatto is a freelance writer.
Related Content
- Exploring software-defined radio (without the annoying RF) – Part 1
- Exploring software-defined radio (without the annoying RF)—Part 2
- SDR Basics Part 3: Transmitters
- The virtual reality of 5G – Part 2 (measurements)
- Ultra-wideband I/Q demodulator improves receiver performance
The post Part 1: A beginner’s guide to the power of IQ data and beauty of negative frequencies appeared first on EDN.
High-breakdown-voltage P-GaN gate HEMTs with threshold voltage of 7.1V
Voskhod 6n1p Glow
Yep, not the best measurement-wise, but for my audio application it sounds pretty decent. And it’s one of the prettiest heaters there is for the ECC88/6N1P.
Sivers signs CHIPS Act contracts with Northeast Microelectronics Coalition Hub
Micro-LED display market to grow to 34.6 million units by 2031
The 2025 CES: Safety, Longevity and Interoperability Remain a Mess

Once again this year, I’m thankfully reporting on CES (formerly also known by its de-acronym’d “Consumer Electronics Show” moniker, although the longer-winded version is apparently no more) from the remote comfort of my home office. There are admittedly worse places to visit than Las Vegas, especially given its newfound coolness courtesy of the Sphere (which I sadly have yet to experience personally).
That said, given the option to remain here, I’ll take it any day, realizing as I say this that it precludes on-camera cameos…which, come to think of it, is a plus for both viewers and myself!
(great job, Aalyia!)
Anyhoo, I could spend the next few thousand words (I’m currently guesstimating, based on repeated past experience, which in some years even necessitated a multi-part writeup series), telling you about all the new and not-new-but-maturing products and technologies showcased at the show. I’ll still do some of that, in part as case study examples of bigger-picture themes. But, to the title of this writeup, this year I wanted to start by stepping back and discussing three overriding themes that tainted (at least in my mind) all the announcements.
Safety
(Who among you is, like me, old enough to recognize this image’s source without cheating by clicking through first?)
A decade-plus ago, I told you the tale of my remote residence-located Linksys router that had become malware-infected.
Ever since then, I’ve made it a point to collect news tidbits on vulnerabilities and the attack vectors that subsequently exploit them, along with manufacturers’ subpar compromise responses. It likely won’t surprise you to learn that the rate of stories I’ve accumulated has only accelerated over time, as well as broadened beyond routers to encompass other LAN and WAN-connected products. I showcased some of them in two-part coverage published five years ago, for example, and disassembled another (a “cloud”-connected NAS) just a few months back.
The insecure-software situation has become so rampant, in fact, that the U.S. Federal Communications Commission (FCC) just unveiled a new program and associated label, the U.S. Cyber Trust Mark, intended to (as TechCrunch describes it) “help consumers make more informed decisions about the cybersecurity of the internet-connected products they bring into their homes.” Here’s more, from Slashdot’s pickup of the news, specifically referencing BleepingComputer’s analysis:
It’s designed for consumer smart devices, such as home security cameras, TVs, internet-connected appliances, fitness trackers, climate control systems, and baby monitors, and it signals that the internet-connected device comes with a set of security features approved by the National Institute of Standards and Technology (NIST). Vendors will label their products with the Cyber Trust Mark logo if they meet NIST cybersecurity criteria. These criteria include using unique and strong default passwords, software updates, data protection, and incident detection capabilities. Consumers can scan the QR code included next to the Cyber Trust Mark labels for additional security information, such as instructions on changing the default password, steps for securely configuring the device, details on automatic updates (including how to access them if they are not automatic), the product’s minimum support period, and a notification if the manufacturer does not offer updates for the device.
Candidly, I’m skeptical that this program will be successful, even if it survives the upcoming Presidential administration transition (speaking of which: looming trade war fears weighed heavily on folks’ minds at the show) and in spite of my admiration for its honorable intention. As reader “Thinking_J” pointed out in response to my recent teardown of a Bluetooth receiver that has undergone at least one mid-life internal-circuits switcheroo, the FCC essentially operates on the “honor system” in this and similar regards after manufacturers gain initial certification.
One of the root causes of such vulnerabilities, IMHO, is any reliance on open-source code, no matter that doing so may ironically also improve initial software quality. Requoting myself:
Open-source software has some compelling selling points. For one thing, it’s free, and the many thousands of developer eyeballs peering over it generally result in robust code. When a vulnerability is discovered, those same developers quickly fix it. But among those thousands of eyeballs are sets with more nefarious objectives in mind, and access to source code enables them to develop exploits for unpatched, easily identified software builds.
I also suspect that at least some amount of laissez-faire tends to creep into the software-development process when you adopt someone else’s code versus developing your own, especially if you subsequently “forget” to make proper attribution and take other appropriate action regarding that adoption. The result is a tendency to overlook the need to maintain that portion of the codebase as exploits and broader bugs in it are discovered and dealt with by the developer community or, more often than not, the one-and-only developer.
Sometimes, though, code-update neglect is intentional:
Consumer electronics manufacturers as a rule make scant (if any) profit on each unit sold, especially after subtracting the “percentage” taken by retailer intermediaries. Revenue tangibly accrues only as a function of unit volume, not from per-unit profit margin. Initial-sale revenue is sometimes supplemented by after-sale firmware-unlocked feature set updates, services, and other add-ons. But more often than not, a manufacturer’s path to ongoing fiscal stability involves straightforwardly selling you a brand-new replacement/upgrade unit down the road; cue obsolescence by design for the unit currently in your possession.
Which leads to my next topic…
Longevity
One of the products “showcased” in my August 2020 writeup didn’t meet its premature demise due to intentionally unfixed software bugs (as was the case for a conceptually similar product in Belkin’s Wemo line, several examples of which I owned when the exploit was announced). Instead, its early expiration was the result of an intentional termination of the associated “cloud” service done by its retail supplier, Best Buy (Connect WiFi Smart Plug shown above).
More recently, I told you about a similar situation (subsequently resolved positively via corporate buyout and resurrection, I’m happy to note) involving SmartLabs’ various Insteon-branded powerline networking products. Then there was the Spotify Car Thing, which I tore down in early 2023. And right before this year’s CES opened its doors to the masses, ironically, came yet another case study example of the ongoing disappointing trend: the $800 (nope, no refunds) Moxie “emotional support” robot, although open source (which, yes, I know I just critiqued earlier here) may yet come to the rescue for the target 5- to 10-year-old demographic.
Government oversight to the rescue, again (?). Here’s a summary, from Slashdot’s highlight:
Nearly 89% of smart device manufacturers fail to disclose how long they will provide software updates for their products, a Federal Trade Commission staff study found this week. The review of 184 connected devices, including hearing aids, security cameras and door locks, revealed that 161 products lacked clear information about software support duration on their websites.
Basic internet searches failed to uncover this information for two-thirds of the devices. “Consumers stand to lose a lot of money if their smart products stop delivering the features they want,” said Samuel Levine, Director of the FTC’s Bureau of Consumer Protection. The agency warned that manufacturers’ failure to provide software update information for warranted products costing over $15 may violate the Magnuson Moss Warranty Act. The FTC also cautioned that companies could violate the FTC Act if they misrepresent product usability periods. The study excluded laptops, personal computers, tablets and automobiles from its review.
Repeating what I said earlier, I’m skeptical that this effort will be successful, despite my admiration for its honorable intentions. In no small part, my pessimism stems from recent US election results, given that Republicans have (historically, at least) been disproportionally pro-business to the detriment of consumer rights. That said, were the manufacturer phase-out to instead be the result of something other than the shutdown of a proprietary “cloud” service, such as (for example) a no-longer-maintained-therefore-viable (or at-all available, for that matter) proprietary application, the hardware might still be usable if it could alternatively be configured and controlled using industry-standard command and communications protocols.
Which leads to my next topic…
Interoperability
Those of you who read to the bitter end of my recently published “2024 look-back” tome might have noticed a bullet list of topics there that I’d originally also hoped to cover but eventually decided to save for later. The first topic on the list, “Matter and Thread’s misfires and lingering aspirations,” I held back not just because I was approaching truly ridiculous wordcount territory but also because I suspected I’d have another crack at it a short time later, at CES to be precise.
I was right; that time is now. Matter, for those of you not already aware, is:
…a freely available connectivity standard for smart home and IoT (Internet of Things) devices. It aims to improve interoperability and compatibility between different manufacturers and security, always allowing local control as an option.
And Thread? I thought you’d never ask. It’s:
…an IPv6-based, low-power mesh networking technology for Internet of things (IoT) products…
Often used as a transport for Matter (the combination being known as Matter over Thread), the protocol has seen increased use for connecting low-power and battery-operated smart-home devices.
Here’s what I wrote about Matter and Thread a year ago, in my 2024 CES discourse:
The Matter smart home communication standard, built on the foundation of the Thread (based on Zigbee) wireless protocol, had no shortage of associated press releases and product demos in Las Vegas this week. But to date, its implementation has been underwhelming (leading to a scathing but spot-on recent diatribe from The Verge, among other pieces), both in comparison to its backers’ rosy projections and its true potential.
Not that any of this was a surprise to me, alas. Consider that the fundamental premise of Matter and Thread was to unite the now-fragmented smart home device ecosystem exemplified by, for example, the various Belkin Wemo devices currently residing in my abode. If you’re an up-and-coming startup in the space, you love industry standards, because they lower your market-entry barriers versus larger, more established competitors. Conversely, if you’re one of those larger, more established suppliers, you love barriers to entry for your competitors.
Therefore the lukewarm-at-best (and more frequently, nonexistent or flat-out broken) embrace of Matter and Thread by legacy smart home technology and product suppliers (for which, to be precise, and as my earlier Blink example exemplifies, conventional web browser access, vs a proprietary app, is even a bridge too far)…Suffice it to say that I’m skeptical about Matter and Thread’s long-term prospects, albeit only cautiously so. I just don’t know what it might take to break the logjam that understandably prevents competitors from working together, in spite of the reality that a rising tide often does end up lifting all boats…or if you prefer, it’s often better to get a slice of a large pie versus the entirety of a much smaller pie.
A year later, is the situation better? Not really, candidly. For a more in-depth supplier-sourced perspective, I encourage you to read Aalyia’s coverage of her time spent last week in Silicon Labs’ product suite, including an interview with Daniel Cooley, CTO of the company. Cooley is spot-on when he notes that “it is not unusual for standards adoption to progress slower than desired.” I’ve seen this same scenario play out plenty of times in the past, and Matter and Thread (assuming they eventually achieve widespread success) won’t be the last. I’m reminded, for example, of a quote attributed to Bill Gates, that “We always overestimate the change that will occur in the next two years and underestimate the change that will occur in the next 10.”
Cooley is also spot-on when he notes that Matter and Thread don’t necessarily need to go together; the Matter connectivity standard can alternatively use Ethernet (either wireless, aka Wi-Fi, or wired) for transport, along with Bluetooth Low Energy for initial device setup purposes (and speaking of wireless smart home network protocols, by the way, a quick aside: check out Z-Wave’s just-announced long range enhancements). And granted, there has been at least progress with both Matter (in particular) and Thread over the past year.
Version 1.4 of the Matter specification, announced last November, promises (quoting from Ars Technica’s coverage) “more device types, improvements for working across ecosystems [editor note: a concept called “Enhanced Multi-Admin”], and tools for managing battery backups, solar panels, and heat pumps”, for example. And at CES, the Connectivity Standards Alliance (CSA), which runs Matter, announced that Apple, Google, and Samsung will accept its certification results for their various “Works With” programs, too. That said, Amazon is notably absent from the CSA’s fast-track certification list. And more generally, Ars Technica was spot-on with the title of its writeup, “Matter 1.4 has some solid ideas for the future home—now let’s see the support.” See you back here this same time next year?
The Rest of the Story
(no, I don’t know what ballet has to do with smart rings, either)
Speaking of “approaching truly ridiculous wordcount territory”, I passed through 2,000 words a couple of paragraphs back, so I’m going to strive to make the rest of this piece more concise. Looking again at the list of potential coverage technology and product topics I scribbled down a few days ago, partway through CES, and after subtracting out the “Matter and Thread” entry I just discussed, I find…16 candidates left. Let’s divide that in two, shall we? Without further ado, and in no particular order save for how they initially streamed out of my noggin:
- Smart glasses: Ray-Ban and Meta’s jointly developed second-generation smart glasses were one of the breakout consumer electronics hits of 2024, with good (initial experience, at least) reason. Their constantly evolving AI-driven capabilities are truly remarkable, on top of the first-generation’s foundational still and video image capture and audio playback support. Unsurprisingly, therefore, a diversity of smart glasses implementations in various function and price-point options, from numerous suppliers and in nonfunctional mockup, prototype, and already-in-production forms, populated 2025 CES public booths and private meeting rooms alike in abundance. I actually almost bought a pair of Ray-Ban Meta glasses during Amazon’s Black Friday…err…week-plus promotion to play around with for myself (and subsequently cover here at EDN, of course). But I decided to hold off for the inevitable barely-used (if at all) eBay-posting markdowns to come. Why? Well, the recent “publicity” stemming from the New Orleans tragedy didn’t help (and here I thought “glassholes” were bad). Even though Ray-Ban Meta offers product options with clear lenses, not just sunglasses, most folks don’t (and won’t) wear glasses all the time, not to mention that battery life limitations currently preclude doing so anyway (and don’t get me started on the embedded batteries’ inherent obsolescence by design). And when folks do wear them, they’re fashion statements. Multiple pairs for various outfits, moods, styles (invariably going in and out of fashion quickly) and the like are preferable, something that’s not fiscally feasible for the masses when the glasses cost several hundred dollars apiece.
- Smart rings: This wearable health product category is admittedly intriguing because unlike glasses (or watches, for that matter), rings are less obvious to others, therefore it’s less critical (IMHO, at least) for the wearer to perfectly match them with the rest of the ensemble…plus you have 10 options of where to wear one (that said, does anyone put a ring on their thumb?). There were quite a few smart rings at CES this year, and next year there’ll probably be more. Do me a favor; before you go further, please go read (but come back afterwards!) The Verge’s coverage of Ultrahuman’s Rare ring family (promo videos at the beginning of this section). The snark is priceless; it was the funniest piece of 2025 CES coverage I saw!
- HDMI: Version 2.2 is en route, with higher bandwidth (96 Gbps) now supporting 16K-resolution displays (along with 4K displays at head-splitting 480 fps), among other enhancements. And there’s a new associated “Ultra96” cable, too. At first, I was a bit bummed when I heard this, due to the additional infrastructure investment that consumers will need to shoulder. But then I thought back to all the times I’d grabbed a random legacy cable out of my box o’HDMI goodies only to discover that, for example, it only supported 1080p resolution, not 4K…even though the next one I pulled out of the box, which looked just like its predecessor down to the exact same length, did 4K without breaking a sweat. And I decided that maybe making a break from HDMI’s imperfect-implementation history wasn’t such a bad idea, after all…
- 3D spatial audio: Up to this point, Dolby’s pretty much had the 3D spatial audio (which expands—bad pun intended—beyond conventional surround sound to also encompass height) stage all to itself with Atmos, but on the eve of CES, Samsung unveiled the latest fruits of its partnership with Google to promulgate an open source alternative called IAMF, for Immersive Audio Model and Formats, now also known by its marketing moniker, “Eclipsa Audio”. In retrospect, this isn’t a terrible surprise; for high-end video, Samsung has already settled on HDR10+ versus Dolby Vision. But I have questions, specifically as to whether Google and Samsung are really going to be able to deliver something credible that doesn’t also collide with Dolby’s formidable patent portfolio. And I also gotta say that the fact that nobody at Samsung’s booth was able to answer one reporter’s questions doesn’t leave me with a great deal of early-days confidence.
- TVs: Speaking of video, I mentioned more than a decade ago that Chinese display manufacturers were beginning to “make serious hay” at South Korean competitors’ expense, much as those same South Korea-based companies had previously done to their Japanese competitors (that said, it sure was nice to see Panasonic’s displays back at CES!). To wit, TCL has become a particularly formidable presence in the TV market. While it and its competitors are increasingly using viewer-customized ads (logging and uniquely responding to the specific content you’re streaming at the time) and other smart TV “platform” revenue enhancements to counterbalance oft-unprofitable initial hardware prices, TCL takes it to the next level with remarkably bad AI-generated drivel shown on its own “free” (translation: advertising-rife) channel. No thanks, I’ll stick with reruns of The Office. That said, the on-the-fly auto-translation capabilities built into Samsung’s newest displays (along with several manufacturers’ earbuds and glasses) were way cool.
- Qi: Good news/bad news on the wireless charging front. Bad news first: the Wireless Power Consortium recently added the “Qi2 Ready” category to its Qi2 specification suite. What this means, simply stated, is that device manufacturers (notably, at least at the moment, of Android smartphones) no longer need to embed orientation-optimization magnets in the devices themselves. Instead, as I’m already doing with my Pixel phones, they can alternatively rely on magnets embedded in accompanying cases. On the one hand, as Apple’s MagSafe ecosystem already shows, if you put a case on a phone it needs to have magnets anyway, because the ones in the phone aren’t strong enough to work through the added intermediary case material. And—I dunno—maybe the magnets add notable bill-of-materials cost? Or they interfere with the phone’s speakers, microphones and the like? Or…more likely (cynically, at least), the phone manufacturers see branded cases-with-magnets as a lucrative upside revenue stream? Thoughts, readers? Now for the good news: auto-movable coils to optimize device orientation! How cool is that?
- Lithium battery-based storage systems: Leading suppliers are aggressively expanding beyond portable devices into full-blown home backup systems. EcoFlow’s monitoring and management software looks quite compelling, for example, although I think I’ll skip the solar cell-inclusive hat. And Jackery’s now also selling solar cell-augmented roof tiles.
- Last but not least: (the) RadioShack (licensed brand name, to be precise) is back, baby!
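Circling back to the HDMI 2.2 bullet above: a quick back-of-the-envelope calculation shows why the 96 Gbps figure matters. The sketch below (my own illustrative function, assuming 10-bit RGB pixels and ignoring blanking and link-encoding overhead) estimates raw video data rates:

```python
def raw_bitrate_gbps(width, height, fps, bits_per_channel=10, channels=3):
    """Raw pixel data rate in Gbit/s, ignoring blanking and link-encoding overhead."""
    return width * height * fps * bits_per_channel * channels / 1e9

# 4K (3840x2160) at 480 fps, 10-bit RGB:
print(raw_bitrate_gbps(3840, 2160, 480))   # ~119.4 Gbit/s
# 16K (15360x8640) at 60 fps, 10-bit RGB:
print(raw_bitrate_gbps(15360, 8640, 60))   # ~238.9 Gbit/s
```

Both raw figures exceed even Ultra96’s 96 Gbps, which suggests these headline modes lean on compression (such as DSC) and/or chroma subsampling rather than uncompressed transport.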
And, now well past 3,000 words, I’m putting this one to bed, saving discussions on robots, Wi-Fi standards evolutions, full-body scanning mirrors with cameras (!!), the latest chips, inevitable “AI” crap and the like for another day. I’ll close with iFixit’s annual “worst of show” coverage:
And with that, I look forward to your thoughts on the things I discussed, saved for later and overlooked alike in the comments!
—Brian Dipert is the Editor-in-Chief of the Edge AI and Vision Alliance, and a Senior Analyst at BDTI and Editor-in-Chief of InsideDSP, the company’s online newsletter.
Related Content
- CES 2025 coverage
- IoT device vulnerabilities are on the rise
- Routers infected with malware: Owners (and manufacturers) beware
- Disassembling a Cloud-compromised NAS
- 2025: A technology forecast for the year ahead
- A Bluetooth receiver, an identity deceiver
- Open Source: Keep It Current Or Suffer The Consequences
- Heartbleed: the wakeup call the open-source community needed?
- Obsolescence by design, defect, or corporate decree
The post The 2025 CES: Safety, Longevity and Interoperability Remain a Mess appeared first on EDN.
CEA-Leti presenting at Photonics West, including Invited Paper on optical phased arrays for LiDAR
LED Meaning, Types, Working, Applications, Uses & Advantages
LED stands for Light Emitting Diode. It is a semiconductor component that transforms electrical energy into light through the process of electroluminescence.
Types of LED
- Standard LEDs: Basic LEDs used in indicators, displays, and signaling.
- High-Power LEDs: Brighter and used in floodlights, automotive headlights, and streetlights.
- RGB LEDs: Red, Green, and Blue LEDs that can produce a range of colours.
- COB LEDs (Chip on Board): Multiple LED chips mounted on a single circuit board for uniform light distribution.
- SMD LEDs (Surface Mounted Diodes): Compact and efficient for general-purpose lighting.
- Filament LEDs: Designed to resemble traditional incandescent bulbs with modern LED technology.
How Does LED Work?
- Semiconductor Material: LEDs use a semiconductor made of materials like gallium arsenide or gallium nitride.
- Electroluminescence: When electrical current flows through the semiconductor, it excites electrons, causing them to release energy in the form of photons (light).
- Phosphor Coating: For white light, blue LEDs are coated with phosphor materials to convert the blue light into white light.
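Because an LED is a diode with a roughly fixed forward voltage rather than a resistive load, it is normally driven through a current-limiting series resistor. A minimal sketch of that everyday calculation (the supply, forward-voltage, and current values are illustrative assumptions, not universal ratings):

```python
def series_resistor(v_supply, v_forward, i_led):
    """Series resistor in ohms per Ohm's law: R = (Vs - Vf) / I."""
    return (v_supply - v_forward) / i_led

# A typical red indicator LED (Vf ~2.0 V, 20 mA) on a 5 V rail:
print(series_resistor(5.0, 2.0, 0.020))  # ~150 ohms
```

In practice you would round to the next standard resistor value and check the resistor’s power dissipation.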
LED Applications
- Residential Lighting: General lighting, ceiling lights, table lamps.
- Commercial Lighting: Offices, retail stores, and large venues.
- Street Lighting: Energy-efficient public illumination.
- Automotive Lighting: Headlights, brake lights, interior lights.
- Displays: TVs, computer monitors, and digital billboards.
- Signage: Outdoor and indoor advertising displays.
- Medical Equipment: Surgical lights, diagnostic tools.
LED Advantages
- Energy Efficiency: Uses up to 80% less energy than incandescent bulbs.
- Long Lifespan: Can last 25,000 to 50,000 hours or more.
- Durability: Resists shocks, vibrations, and extreme temperatures.
- Eco-Friendly: Free of toxic materials like mercury.
- Instant Lighting: Lights up immediately without warm-up time.
- Dimmable: Many LEDs can be adjusted for brightness.
- Design Flexibility: Available in various shapes, colors, and sizes.
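The “up to 80% less energy” figure above translates directly into running-cost savings. A rough comparison, assuming an illustrative 60 W incandescent bulb versus a roughly equivalent 10 W LED, 5 hours of use per day, and an assumed $0.15/kWh electricity price:

```python
def annual_energy_cost(power_w, hours_per_day, price_per_kwh):
    """Yearly electricity cost of one lamp, in the same currency as the kWh price."""
    return power_w / 1000 * hours_per_day * 365 * price_per_kwh

incandescent = annual_energy_cost(60, 5, 0.15)  # ~$16.43/year
led = annual_energy_cost(10, 5, 0.15)           # ~$2.74/year
print(incandescent - led)                        # ~$13.69/year saved per bulb
```

The roughly 83% reduction in this example is consistent with the figure quoted above; actual savings depend on the specific bulbs and local tariffs.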
LED Disadvantages
- Higher Initial Cost: More expensive upfront compared to traditional lighting.
- Heat Sensitivity: Requires proper heat dissipation to maintain performance.
- Color Accuracy: Lower-quality LEDs may have poor color rendering.
- Blue Light Emission: Excessive blue light exposure may cause discomfort or disrupt sleep.
- Compatibility Issues: Some older fixtures or dimmers may not work well with LEDs.
The post LED Meaning, Types, Working, Applications, Uses & Advantages appeared first on ELE Times.
LED Lighting Definition, Types, Applications and Benefits
LED (Light Emitting Diode) lighting is a lighting technology that utilizes semiconductors to transform electrical energy into visible light. LEDs are highly efficient, durable, and versatile, making them suitable for a wide range of applications, from home lighting to industrial and automotive use.
History of LED Lighting
- 1907: H.J. Round first observed electroluminescence in silicon carbide, which became a foundational discovery for the development of LED technology.
- 1962: Nick Holonyak Jr., working at General Electric, created the first visible-spectrum LED (red).
- 1970s: LED technology expanded with additional colors like green and yellow, though applications were limited to indicators and displays.
- 1990s: Blue LEDs were developed by Shuji Nakamura, enabling the creation of white LEDs by combining blue light with phosphor coatings.
- 2000s: LEDs began to replace traditional incandescent and fluorescent lighting in many applications due to advances in efficiency, color rendering, and cost.
- Today: LEDs dominate the lighting industry with widespread applications, from smart home systems to streetlights and displays.
Types of LED Lighting
- Miniature LEDs
- Used in indicators, displays, and small electronics.
- High-Power LEDs
- Brighter and used in high-intensity applications like floodlights and automotive headlights.
- RGB LEDs
- Combine red, green, and blue LEDs to produce various colors; used in displays and decorative lighting.
- COB LEDs (Chip on Board)
- Provide high brightness and even light distribution; common in spotlights and downlights.
- SMD LEDs (Surface-Mounted Diodes)
- Compact and versatile; widely used in strip lighting and general-purpose lighting.
- Filament LEDs
- Mimic traditional filament bulbs; used for decorative lighting.
How Does LED Lighting Work?
- Semiconductor Materials: LEDs use a semiconductor (typically gallium arsenide or gallium nitride).
- Electric Current: When electricity flows through the diode, electrons combine with holes in the semiconductor material, releasing energy in the form of photons (light).
- Phosphor Coating: For white light, a blue LED is coated with a phosphor material to convert blue light into white light.
Applications of LED Lighting
- Residential: General lighting, decorative lighting, and smart home systems.
- Commercial: Office spaces, retail displays, and signage.
- Industrial: Factory lighting, warehouse illumination, and hazardous environments.
- Automotive: Headlights, interior lighting, and brake lights.
- Street Lighting: Energy-efficient public lighting systems.
- Displays: TVs, monitors, and large digital billboards.
- Medical: Surgical lighting and diagnostic devices.
How to Use LED Lighting
- Select the Right Type: Choose LEDs based on brightness (lumens), color temperature (warm, cool, or daylight), and beam angle.
- Install Proper Fixtures: Use fixtures designed for LEDs to ensure optimal performance and longevity.
- Control Options: Utilize dimmers, smart systems, or RGB controllers for customized lighting.
- Placement: Position LEDs effectively to reduce glare and enhance the desired ambiance.
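The brightness selection step above can be put on a quantitative footing with the basic illuminance relation lumens = lux × area. A sketch, using an assumed 150 lux ambient target (a common rule-of-thumb figure for living spaces, not a standard requirement):

```python
def lumens_needed(area_m2, target_lux):
    """Total luminous flux required: lumens = lux x area (uniform illumination)."""
    return target_lux * area_m2

# A 4 m x 5 m living room at a 150 lux ambient target:
print(lumens_needed(4 * 5, 150))  # 3000 lumens
```

In practice, add margin for fixture losses, wall reflectance, and dimming headroom, and split the total across multiple luminaires.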
Advantages of LED Lighting
- Energy Efficiency: LEDs consume up to 80% less power compared to traditional incandescent bulbs.
- Long Lifespan: Can last 25,000–50,000 hours, significantly longer than traditional lighting.
- Durability: Resistant to shocks, vibrations, and extreme temperatures.
- Eco-Friendly: Contains no toxic materials like mercury and emits less heat.
- Design Flexibility: Available in various shapes, colours, and sizes.
- Instant Illumination: LEDs turn on immediately without any warm-up period.
- Dimmable and Controllable: Many LEDs support dimming and integration into smart lighting systems.
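The rated-hours lifespan figures above are easier to grasp in calendar terms. A quick conversion, assuming an illustrative 5 hours of use per day:

```python
def lifespan_years(rated_hours, hours_per_day=5):
    """Convert a rated lamp life in hours to years at a given daily duty cycle."""
    return rated_hours / (hours_per_day * 365)

print(round(lifespan_years(25000), 1))  # 13.7 years
print(round(lifespan_years(50000), 1))  # 27.4 years
```

Heavier daily use shortens the calendar figure proportionally, which is worth remembering when comparing manufacturer lifespan claims.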
Disadvantages of LED Lighting
- Higher Upfront Cost: LEDs are more expensive initially compared to traditional lighting.
- Heat Sensitivity: Performance can degrade if not properly cooled.
- Color Rendering: Some cheaper LEDs may have lower color rendering accuracy.
- Blue Light Concerns: Excessive blue light exposure from LEDs may cause eye strain or disrupt sleep cycles.
- Compatibility Issues: May not work well with older dimmers or fixtures without modifications.
The post LED Lighting Definition, Types, Applications and Benefits appeared first on ELE Times.
Troubleshooting Flowchart from Practical Electronics for Inventors. What would you add? Is this a good guide?
submitted by /u/SkunkaMunka