ELE Times

latest product and technology information from electronics companies in India

Zoho Allocates $200 Million for Semiconductor Fab, Seeks Licence Approval

Wed, 06/19/2024 - 14:27

Chennai-based Zoho Corp. has applied for a licence to build a compound semiconductor fabrication plant, committing $200 million to the project. According to two senior industry officials, this compound fab project is expected to attract around $600 million in net investment and will involve a technology partner.

Sridhar Vembu, co-founder and chief executive of Zoho Corp, announced at the company’s annual event in Texas, US, that the licence application has been submitted and that the company is currently awaiting government clearance. A formal announcement will follow upon approval. Vembu added that Zoho is hopeful the government will co-finance the project under the compound semiconductor manufacturing scheme.

One of the officials disclosed that Zoho’s fab project aims to bring silicon carbide (SiC) chip manufacturing to Tamil Nadu, with plans to license the technology from Clas-SiC, a Scotland-based company.

Subsidies for compound semiconductor fabs are part of the Indian government’s initiative to develop the semiconductor and display manufacturing ecosystem, with guidelines revised in June 2023. Under this scheme, applicants for a fab can receive fiscal support covering 50% of their capital expenditure if they meet a specified capex threshold.

Silicon carbide chips are used in various industrial applications, including electric vehicles (EVs). In October, Vembu revealed that Zoho is exploring investments in advanced chemistry cells, the core components of batteries used in electronics and EVs.

“This space is under exploration, and discussions are underway with some stakeholders. While it’s premature for an announcement, there is a requirement to develop deep tech and manufacturing capabilities in India. The advanced cell chemistry production-linked incentive (PLI) schemes also present substantial potential. Individuals with experience in cell R&D are being sought, and discussions with some candidates are presently ongoing”, Vembu said.

The two initiatives are likely linked, the second official mentioned.


India’s Electronic Manufacturing Set to Double in Five Years: Government Sources

Wed, 06/19/2024 - 14:20

According to sources from the Ministry of Electronics and Information Technology, India’s electronic manufacturing sector is projected to double within the next five years.

The ministry source stated that the country’s electronic manufacturing is expected to reach approximately USD 250 billion over this period, a significant increase from the current export value of USD 125-130 billion.

In a bid to tackle unemployment, the government aims to double the number of jobs in the electronic manufacturing sector from the current 2.5 million to around 5 million in the next five years.

“Our focus remains on providing services to digital technology and scaling up large-scale electronics manufacturing. These goals are steadfast and will only accelerate,” said Ashwini Vaishnaw, Minister for Electronics and Information Technology.

The sources also noted that India is moving towards self-reliance (Atmanirbhar) in certain segments like mobile phones, transitioning from import substitution to becoming an export-driven manufacturer. For laptops, India is currently in the process of achieving self-reliance.

The Indian government has allocated Rs 760 billion through various incentive schemes to boost electronic manufacturing in the country. Despite India’s per capita electronic consumption being one-fourth of the global average, significant strides are being made.

In terms of imports, China and Hong Kong dominate, contributing 44% and 16% respectively to India’s total electronic imports. On the export front, mobile phones and Electronic Control Units (ECUs) lead, with the United States and UAE being the primary destinations, absorbing a substantial portion of India’s electronic exports.

Experts highlight that India’s electronic manufacturing sector is transforming, positioning the nation as a burgeoning global electronics manufacturing hub, which will drive economic growth, job creation, and technological innovation.


Anti-Dumping Duty on PCBs Challenges IT Hardware PLI Beneficiaries

Wed, 06/19/2024 - 14:03

The imposition of a 30% anti-dumping duty (ADD) on bare printed-circuit boards (PCBs) is impacting companies benefiting from India’s production-linked incentive (PLI) scheme for IT hardware. This ADD, applied in response to complaints from six local PCB manufacturers about cheap imports from China and Hong Kong, is increasing production costs and reducing global competitiveness for domestically manufactured products.

Industry executives report that the ADD raises production costs by 3-4% for lighting products, 1% for IT hardware, and 2-3% for telecom products. This cost increase is particularly burdensome for firms engaged in domestic PCB assembly, a requirement under the revised IT hardware PLI scheme, which necessitates local component sourcing and PCB assembly (PCBA) within the first year of operation.

Some 27 companies, including major players like Dell, HP, Dixon Technologies, Lava, Foxconn, Lenovo, and Optiemus, have committed to manufacturing IT hardware under the PLI scheme, which has a Rs 17,000-crore outlay. Despite incentives for local PCBA, the ADD on bare PCBs poses a significant challenge.

However, the duty exempts mobile phone manufacturers, PCBs with more than six layers, flex PCBs, and other complex designs. This selective application of the ADD highlights disparities in how the policy affects different sectors within the electronics industry.

The ADD aims to protect domestic PCB manufacturers, but it also creates hurdles for companies trying to meet the PLI scheme’s localisation requirements while maintaining global competitiveness. Balancing these objectives remains a complex challenge for policymakers and industry stakeholders.


Rohde & Schwarz leads in device conformance for 3GPP Mission Critical Services, trailblazing migration to broadband

Wed, 06/19/2024 - 10:10

Rohde & Schwarz has successfully validated mission-critical push-to-talk protocol conformance test cases for the Global Certification Forum (GCF) and achieved Test Platform Approval Criteria (TPAC) for its R&S TS‑PCT protocol conformance tester. Rohde & Schwarz is the first and only test and measurement company to activate the initial mission-critical push-to-talk work item. With this achievement, the company is paving the way for a smooth transition from narrowband TETRA to broadband 3GPP Mission Critical Services for device manufacturers and service providers.

Rohde & Schwarz pioneered the verification of mission-critical push-to-talk (MCPTT) test cases, enabling the GCF to launch a first-of-its-kind certification program for Mission Critical Services (MCS). With this program, public safety network operators can be assured that mission-critical applications and devices are interoperable thanks to a common standard for vendors.

3GPP MCS refers to a suite of communications services that are essential for public safety and emergency response operations. It ensures seamless communications for first responders, enabling efficient coordination and rapid response in critical situations. Often referred to as MCX, these mission-critical services, which include not only MCPTT but also mission-critical data (MCData) and mission-critical video (MCVideo), meet stringent requirements for reliability, availability and security. The transition to 3GPP-compliant broadband MCX provides enhanced capabilities over traditional narrowband systems, offering higher data rates, improved coverage and the ability to support a wide range of multimedia applications.

3GPP MCS applications and devices must be certified by test laboratories accredited by certification bodies such as GCF, PTCRB or CTIA before they can be launched on the market. At the last meeting of the Conformance Agreement Group (CAG) #78bis, Rohde & Schwarz validated the 3GPP MCS test cases of Work Item 288, using the R&S TS-PCT protocol conformance tester, which is based on the R&S CMX500 radio communication tester and the R&S CONTEST software running the test sequences. This success enabled the GCF to activate the work item and officially start conformance testing for MCS devices.

The protocol conformance test cases were developed using the R&S CMX500 radio communication tester and allow engineers to evaluate their MCS applications and devices under realistic conditions. The device under test establishes a real‑time, comprehensive connection to the simulated MCS network and tests the relevant signaling and RF scenarios in line with 3GPP specifications.

For more information about 3GPP conformance test solutions from Rohde & Schwarz, visit: https://www.rohdeschwarz.com/products/test-and-measurement/conformance-test-systems-3gpp-ctia_91815.html


Maximising power delivery: 240 W USB PD reference design solutions with highest power capabilities

Mon, 06/17/2024 - 14:37

The USB Power Delivery (PD) standard continues to evolve. The most recent USB PD 3.2 specification now supports sourcing and sinking up to 240 W (48 V, 5 A). This article will discuss the main trends in USB PD and explore why implementing USB-C in 2024 is no longer considered an innovation, but a good approach to staying competitive with new designs. Finally, the article will provide an overview of a source and sink solution that supports 240 W using Infineon’s primary- and secondary-side controllers, EZ-PD™ PMG1 high voltage microcontrollers with integrated USB-C PD and Infineon’s CoolGaN™ power transistors.

USB is currently a well-recognised set of standards in consumer and automotive applications and is rapidly expanding into the industrial and e-mobility sectors. Starting with the first official USB 1.0 revision in 1996, which supported up to 1.5 Mbps data transfer and up to 2.5 W of power, it has evolved to support an incredible 80 Gbps and 240 W through a single USB-C cable. Nevertheless, about half of all of Arrow’s booth visitors at a recent show were surprised to learn that USB standards already support 240 W.

USB Type-C connector

Let’s dive deeper into the details of the USB Type-C standard connector. The big advantage of this connector is that the interface is plug-agnostic due to its symmetrical design – it functions regardless of which way it is connected. This connector supports power delivery up to 240 W and accommodates protocols like HDMI, DisplayPort, Thunderbolt, and others through its Alternate Mode, known as Alt Mode. These features together have had significant influence on the industry.

Looking at the connector pins, shown in Figure 1, the CC1 and CC2 pins – known as cable configuration pins – are used to detect plug orientation and determine the port roles: Downstream Facing Port (DFP) for the source power role, Upstream Facing Port (UFP) for the sink role, and Dual-Role Port (DRP) for both source and sink roles. Another function of the CC pins is to carry the data exchanged when negotiating Power Delivery contracts between the source and sink.

Four differential pairs, labelled RX/TX, carry USB data at 5 Gbps and higher and have been used since USB 3.1. These lines operate in full-duplex mode, while the two pairs of legacy pins, D+/D-, located in the centre, operate in half-duplex mode. Pins labelled SBU, or SideBand Use, are utilised for Alt Modes, such as enabling video output for DisplayPort or Thunderbolt. The VBUS pins deliver power up to 240 W. All four external ground (GND) pins are intended for grounding. Proper grounding is essential for USB-C at high data rates and when delivering high power over the connector.

USB Power Delivery specification

Taking a deeper dive into the USB Power Delivery standard: as mentioned earlier, the initial versions of the USB standard allowed a maximum power of 2.5 W (5 V @ 500 mA) over VBUS. With USB 3.0, this increased slightly to 4.5 W (5 V @ 900 mA), but it was still not sufficient for many applications.

Figure 2: USB PD 3.2 extended power range

With the introduction of USB Type-C and Power Delivery in 2014, USB power capabilities increased significantly. With USB Type-C alone, the maximum available power rose to 15 W (5 V @ 3 A). With a full USB PD implementation, it became possible to deliver up to 100 W (20 V @ 5 A) from a single USB source.

Starting with USB Power Delivery 3.1, the specification allows up to 240 W (48 V @ 5 A). All voltage levels above 20 V are categorised as the Extended Power Range (EPR). The specification also includes an Adjustable Voltage Supply (AVS) mode, which allows the voltage to be adjusted in 100 mV steps for voltages above 15 V.
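
To make these figures concrete, here is a short illustrative Python sketch (not vendor code) that reproduces the power levels quoted above and counts the 100 mV AVS settings between 15 V and 48 V; the helper names and the exact AVS range used are assumptions for illustration.

```python
# Illustrative sketch (not vendor code): reproduces the USB power levels
# quoted in the article and the 100 mV Adjustable Voltage Supply (AVS)
# stepping used above 15 V in the Extended Power Range.

USB_POWER_LEVELS = [
    ("USB 1.0/2.0 default", 5.0, 0.5),   # 2.5 W
    ("USB 3.0 default",     5.0, 0.9),   # 4.5 W
    ("USB Type-C @ 3 A",    5.0, 3.0),   # 15 W
    ("USB PD SPR maximum",  20.0, 5.0),  # 100 W
    ("USB PD EPR maximum",  48.0, 5.0),  # 240 W
]

def power_watts(voltage_v: float, current_a: float) -> float:
    """P = V * I for a fixed-voltage contract."""
    return voltage_v * current_a

def avs_voltages(v_min: float = 15.0, v_max: float = 48.0, step: float = 0.1):
    """Hypothetical AVS ladder: 100 mV steps between v_min and v_max."""
    steps = int(round((v_max - v_min) / step))
    return [round(v_min + i * step, 1) for i in range(steps + 1)]

if __name__ == "__main__":
    for name, v, i in USB_POWER_LEVELS:
        print(f"{name:22s}: {v:5.1f} V @ {i:.1f} A -> {power_watts(v, i):6.1f} W")
    print(f"AVS settings from 15.0 V to 48.0 V in 0.1 V steps: {len(avs_voltages())}")
```

Running the sketch prints the 2.5 W, 4.5 W, 15 W, 100 W and 240 W levels discussed above, plus 331 possible AVS settings between 15 V and 48 V.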

Market trends evolution

USB is an almost 30-year-old standard, but the most significant improvements in speed and power capabilities were observed just recently. Over the last eight years, USB-C has been adopted by mainstream device manufacturers, such as laptop and smartphone makers.

Now, most laptops feature at least one USB-C port. Additionally, many other mobile devices have switched to USB-C, alongside a noticeable increase in USB-C power adapters on the market. Simultaneously, automotive manufacturers have added more USB-C ports in their vehicles to offer enhanced charging options for drivers and passengers. Following this trend, the adoption of USB-C in the embedded and industrial sectors is expected to increase significantly in the next few years.

The vision for 2025 and beyond is that many electronic devices, currently powered at up to 240 W, will adopt USB-C as a standard port for data communications and charging.

Figure 3: Main market trends – many electronic devices will adopt USB-C for data communications and charging at up to 240 W

Among the key factors further driving USB adoption are the race in battery technologies, the shift to industrial applications, and the increasing worldwide acceptance of USB. Local regulations will also have a significant impact: USB Type-C and USB PD will become mandatory across all European Union countries for many applications by the end of 2024. Similar discussions are currently underway in the USA and other countries.

Following these trends today isn’t just about staying at the cutting edge of technology; it has become an important competitive feature that secures a position in the market. But what are the main advantages of the technology driving such rapid adoption of the standard in recent years?

Key technology drivers

Combined data and power capability: one of the most important drivers of USB-C’s fast growth is its ability to transmit high-speed data, video, and power through a single, slim connector. This feature enables the design of smaller and thinner devices.

Unification and reuse advantage: Traditional power adapters come with fixed voltage and current levels and often feature unique barrel connectors, which means they can only be used with their intended device. Such adapters are often not compatible with other devices.

In contrast, USB-C power adapters are universally compatible, offering a USB Type-C connector that works across many devices. Moreover, with a 240 W USB PD adapter, it is possible to negotiate voltages and currents up to 240 W (48 V @ 5 A). This enables charging any USB device up to 240 W, including 5 V or 15 V devices, 45 W phones, and 160 W laptops. Even future devices, such as a power tool, an e-bike or a consumer 3D printer, could potentially be charged using the same adapter. A single USB power adapter can be used to charge many devices, thereby saving costs for consumers.

Saving R&D and manufacturing costs: customized power supplies, whether integrated or using a customized connector, often require additional investments in design, manufacturing, and testing, especially when production quantities are not very high.

Adopting a USB-C power adapter instead of a custom power supply, along with an appropriate Infineon USB-C PD controller to sink power, helps decrease the cost-per-watt ratio. This happens naturally thanks to the intense competition among charger manufacturers in the mass market.

Improving time-to-market: in addition to the higher cost of designing an external or embedded charger, customized solutions often require much more time for R&D and further testing. A simple USB sink implementation combined with a certified USB charger, by contrast, enables a faster time to market in most application cases.

Reducing dependence on OEMs: another situation arises when the power supply or integrated AC/DC power module comes from an OEM manufacturer. In this case, the future of the project may depend heavily on that third party’s manufacturing plans. Once the USB sink function is implemented, any certified USB adapter with similar power capabilities from the consumer market can be used, thus reducing dependence on a single OEM supplier.

E-waste reduction: at first glance, it may not be evident, but the unification of chargers has a significant impact on environmental protection. While each individual charger is small, the environmental consequences are substantial when considering the millions of chargers discarded each year.

Improving brand perception: a design company that keeps up with trends is always more appealing to potential customers. Imagine how you would feel if your company bought modern, expensive measurement equipment that came with a CD for installing the drivers and software tools.

USB Type-C connector: besides all the USB driving factors and advantages mentioned previously, the most important enabler was the introduction of the 24-pin USB Type-C connector in 2014 (see Figure 1).

240 W USB PD sink reference design

Following the latest trends in the industry, Arrow and Infineon introduced the new 240 W PD 3.1 sink reference design using Infineon’s EZ-PD™ PMG1-S3 high-voltage microcontroller, intended to support high-power USB applications. The new reference design supports up to a 48 V @ 5 A Power Delivery Object (PDO) in sink mode, the highest level achievable with the latest USB PD standards.

The reference design further extends the existing power sink capabilities of Infineon’s EZ-PD™ PMG1 family of high-voltage microcontrollers from 140 W to 240 W, which is important for many power-demanding and fast-charging applications.

Figure 4: REF_ARIF240WS3 sink reference design board

This 240 W sink reference design board REF_ARIF240WS3, shown in Figure 4, complements the recently released Infineon’s USB PD 3.1 source evaluation board REF_XDPS2222_240W1. This combination enables engineers to be among the first on the market with a complete 240 W USB PD 3.1 source-sink solution.

Infineon’s 240 W USB PD source reference design

The complementary design for a 240 W power source, Infineon’s REF_XDPS2222_240W1 based on CoolGaN™ technology, is a high-efficiency USB PD 3.1 reference design with a power density of 25 W/in³, using the XDP™ digital power XDPS2222 PFC + hybrid flyback (HFB) combo IC (Figure 5).

Figure 5: REF_XDPS2222_240W1 source reference design board

The internal handshaking between the PFC and HFB controller and the adaptive bus voltage setting make the XDP™ XDPS2222 controller ideal for applications with a wide AC input and a very wide output voltage range, such as USB PD extended power range (EPR) adapters and battery chargers. The main features include HFB ZVS high- and low-side operation, fast-response HFB peak current control, harmonised PFC and HFB operation, pulse skipping at light loads, automatic PFC enable/disable control, and a self-adjusting PFC bus voltage level.

These reference designs are intended to support a broad range of applications, like light electric vehicles (e-bikes, e-scooters and other personal e-mobility devices), drones, mobile robots, 3D printers, professional AV equipment, power tools, medical equipment, home appliances, home entertainment devices, and more.

But generally, any application requiring 0 to 240 W could benefit from adopting USB-C for power and from the outstanding USB PD features provided by these advanced designs.

Supporting services

Both reference designs are available on request. In addition, Arrow offers a range of technical support services, including schematic customisation and PCB modifications, helping to maximise the potential of customers’ designs and reduce time-to-market.


AEKD-TRUNKL1: one power liftgate demo can transform how engineers think about zonal architecture

Mon, 06/17/2024 - 14:37

Author: STMicroelectronics

The power liftgate built on the AEKD-TRUNKL1 is always a highly popular and easily recognizable demo, as its loud beeps alert attendees that the trunk is in motion. Behind the scenes, however, there is a lot more than “meets the ears.” For one thing, while the application focuses on a powered liftgate, the numerous boards and control systems involved can govern the entire rear of the vehicle. Consequently, despite appearances, this isn’t a demo about a single application but about how engineers can create a zonal architecture. Additionally, the presentation uses cost-effective sensors that many would never have thought to employ in an automotive application, helping integrators solve detection issues and make their solutions more accessible.

AEKD-TRUNKL1 or the creation of a zonal architecture

Witness

The most visible aspect of the demo is the comprehensive nature of the AEKD-TRUNKL1 platform. As the video above shows, there are eight boards involved. At first glance, users notice the AEK-LED-21DISM1 board as its bright lights blink during operation or when sending an alert. Too often, engineers forget that LEDs are a part of the user interface that can significantly enhance the experience. Additionally, attendees hear the acoustic vehicle alerting system (AVAS) handled by the AEK-AUD-C1D9031 board, which generates loud beeps thanks to its class-D audio amplifier. In fact, many know that the demo is on the show floor before they even see it.

Interact

The AEKD-TRUNKL1 demo at an ST Technology Tour

When users come to interact with the demo, they wave a hand underneath the fake car bumper. It’s at this moment that the time-of-flight sensors of the AEK-SNS-2TOFM1 detect movements and trigger the trunk’s opening or closing. Attendees can also control the application by using the Bluetooth signal from a mobile app that talks to the AEK-COM-BLEV1 Bluetooth communication board. Alternatively, there’s a contactless car key system thanks to the X-NUCLEO-NFC06A1 NFC reader. The AEK-MOT-TK200G1 then controls the motors responsible for moving the trunk gate. Attendees also see the AEK-LCD-DT028V1 display messages that can help developers debug code or showcase features.
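
To illustrate the kind of logic behind this interaction (this is not ST’s firmware; the thresholds, sample rate and function names below are invented for the sketch), a wave under the bumper can be approximated as a time-of-flight distance reading that dips below a near threshold and then returns beyond a far threshold within a short window:

```python
# Minimal illustration (not ST firmware): detect a "wave" under the bumper
# from a stream of time-of-flight distance samples. A wave is approximated
# as the distance dropping below a near threshold and rising back above a
# far threshold within a short window. Thresholds and timings are invented.

from collections import deque

NEAR_MM = 150        # hand close to the bumper
FAR_MM = 400         # hand withdrawn again
WINDOW_SAMPLES = 30  # e.g. about 1 s at a 30 Hz sensor update rate

def detect_wave(samples_mm, near=NEAR_MM, far=FAR_MM, window=WINDOW_SAMPLES):
    """Return True when a below-near then above-far transition occurs
    within `window` consecutive samples."""
    recent = deque(maxlen=window)
    for distance in samples_mm:
        recent.append(distance)
        went_near = min(recent) < near
        if went_near and recent[-1] > far:
            return True          # a wave was seen: trigger the liftgate
    return False

if __name__ == "__main__":
    # Simulated readings: idle, hand approaches, hand withdraws, idle again
    trace = [800] * 10 + [300, 180, 120, 100, 130, 250, 450, 700] + [800] * 10
    print("Trigger liftgate:", detect_wave(trace))  # True
```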

Learn

Behind the scenes, the application runs on the host SPC58EC80E5 MCU present in the AEK-MCU-C4MLIT1. Our engineers also added features like movement detection, which stops the motors immediately if the vehicle is in motion. This is possible thanks to the AEK-CON-SENSOR1 sensor board, which houses an inertial MEMS on its DIL 24 socket. Finally, the whole application is available in AutoDevKit Studio to help developers run it rapidly and learn from our implementation. Teams would have to contact their ST representatives as we do not provide the entire demo kit for sale. However, almost all boards are available for purchase, and users can download the AutoDevKit Studio IDE to get started.

Indeed, if one thing shines through this demo, it is that it is bigger than the sum of its parts. The Bluetooth or NFC system can control other features. The AVAS board can alert users of other dangers, and the motor control application can power windows. Put simply, it is time to rethink car applications by framing them in a holistic methodology. It’s easy to break every feature into a subsystem. However, by looking at the back of the vehicle as a whole, combining functionalities onto the same host MCU, and reusing devices, we simplify developments and significantly reduce costs, thus providing more features to mainstream vehicles.

Obviously, ST does not advocate combining all subsystems into one computer. In fact, the benefit of a zonal architecture is its inherent decentralization. For instance, a zonal architecture means that the distance between the sensors and the ECU is far shorter, thus greatly simplifying designs and improving reliability. By using a zonal approach instead of a domain-based architecture, it’s also possible to implement advanced features faster since there are fewer risks of a bug causing serious issues, thus hastening developments and lowering costs. It also makes maintenance operations easier since technicians can focus on a particular zone to more quickly isolate issues compared to scanning the entire car’s software for problems.

On the other hand, installing an ECU to handle only one or two real-time operations at a time can be cumbersome and costly. The AEKD-TRUNKL1 power liftgate demo is thus an object lesson in finding the right balance by adopting a zonal architecture that can make advanced powered liftgates more cost-effective and therefore more accessible to entry-level and mainstream vehicles. As MCUs increasingly become powerhouses while keeping consumption down, developers must make better use of their capabilities to combine more functionalities under one ECU. A zonal architecture can thus help bring features currently present only in higher-end models to more mainstream vehicles and transform a vehicle’s value proposition.

Power liftgate or the evolution of sensors in cars

Learn

A greater number of sensors help with the user experience

Traditionally, car makers favor sonar to detect the movements under the car that trigger the power liftgate. The issue is that these devices are expensive and not always practical. For instance, accuracy on early models was notoriously low, which forced car makers to publish videos teaching their customers how to “kick” under the bumper to activate the liftgate. Our teams solved this challenge by using multiple time-of-flight sensors instead of sonars. Even though more devices are used, the bill of materials is still significantly more cost-effective. Additionally, the system can track a greater range of movements. Users don’t have to learn a precise leg move. Instead, they can perform more natural movements with greater confidence.

Interact

Using time-of-flight sensors on the outside of the car or the inertial MEMS detecting if the vehicle is moving demonstrates that an automotive solution can use more classical and cost-effective devices. Too often, manufacturers use costly devices when sensing the outside world. However, the ST power liftgate demo shows that it is possible to get even better results at a fraction of the cost and with devices that are far easier to source. Put simply, it challenges common assumptions holding the industry back by showing that a device like a time-of-flight or inertial sensor can bring new features to more mainstream vehicles.

Witness

Furthermore, as more sensors embed greater intelligence, like a finite state machine, a DSP, or a machine learning core, makers are witnessing the advent of AI at the edge. There will always be cameras and image recognition software that analyze the road and the car’s surroundings. However, the AEKD-TRUNKL1 demo shows a rise in simpler sensors that provide information to various zones and can even offer edge processing. Consequently, an engineer could install a simpler and more cost-effective communication interface since there will be less data traveling. In a nutshell, the time-of-flight and inertial sensors of the ST power liftgate demo are a sign of new trends shaping the industry.


Choosing the Right Solution for Industrial Automation: AGVs vs. AMRs

Fri, 06/14/2024 - 14:25

In the realm of industrial automation, mobile Automated Guided Vehicles (AGVs) and Autonomous Mobile Robots (AMRs) are indispensable in enhancing efficiency by transporting materials within factories and warehouses. Both AGVs and AMRs share fundamental attributes and perform similar automation tasks, but their key differences in navigation capabilities and obstacle avoidance set them apart.

AGVs: Precision and Predictability

AGVs operate by following predetermined routes, and they do not deviate from these paths. When an AGV encounters an obstacle, it stops and waits until the obstacle is removed. AGVs employ various sensing techniques to detect their routes:

  1. Magnetic Tape: Sensors under the AGV detect magnetic tape on the factory floor and adjust the vehicle’s position accordingly. Additional magnetic tape can encode specific locations.
  2. Inductive Wire: Embedded wires in the floor guide the AGV, with sensors detecting the wire to maintain the correct path.
  3. Visual Tracking: Coloured tapes or markers like AprilTags are placed on the ground and detected by RGB cameras to map the route and determine location.
  4. Laser Guidance: A 360° laser on the AGV interacts with reflectors installed in the facility, measuring distances and angles to triangulate the AGV’s position.

AMRs: Flexibility and Intelligence

AMRs, on the other hand, utilize Simultaneous Localization and Mapping (SLAM) to navigate autonomously. Equipped with depth sensors, typically Lidar scanners, AMRs create and store detailed maps of their environment. This process involves initially driving the AMR around the facility to accumulate scans and generate a complete map. The map is then stored on the AMR and in fleet management control software and can be enhanced with additional data such as keep-out zones, speed reduction areas, and docking station locations. Goals are set on the map as coordinates, which the AMR navigates between by continuously comparing real-time scans with the stored map and adjusting its route to avoid obstacles.
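
As a toy illustration of these ideas (this is not a production SLAM stack; the grid contents and helper names are invented), the sketch below stores an occupancy map annotated with keep-out zones, expresses goals as grid coordinates, and checks whether a straight-line route between two coordinates stays in free space:

```python
# Toy sketch of the concepts described above, not a production SLAM stack:
# a stored occupancy grid annotated with keep-out zones, goals expressed as
# grid coordinates, and a check that a straight-line route stays in free
# space. Map contents and helper names are invented for illustration.

FREE, OCCUPIED, KEEP_OUT = 0, 1, 2

def make_map(width, height, obstacles=(), keep_out=()):
    grid = [[FREE] * width for _ in range(height)]
    for x, y in obstacles:
        grid[y][x] = OCCUPIED
    for x, y in keep_out:  # e.g. zones added in fleet-management software
        grid[y][x] = KEEP_OUT
    return grid

def line_cells(p0, p1):
    """Grid cells crossed by the straight line between two coordinates
    (Bresenham's algorithm)."""
    (x0, y0), (x1, y1) = p0, p1
    dx, dy = abs(x1 - x0), -abs(y1 - y0)
    sx, sy = (1 if x1 > x0 else -1), (1 if y1 > y0 else -1)
    err = dx + dy
    while True:
        yield x0, y0
        if (x0, y0) == (x1, y1):
            return
        e2 = 2 * err
        if e2 >= dy:
            err += dy
            x0 += sx
        if e2 <= dx:
            err += dx
            y0 += sy

def route_is_clear(grid, start, goal):
    """A direct route is acceptable only if every cell it crosses is FREE."""
    return all(grid[y][x] == FREE for x, y in line_cells(start, goal))

if __name__ == "__main__":
    grid = make_map(10, 6, obstacles=[(5, 2), (5, 3)], keep_out=[(2, 4)])
    print(route_is_clear(grid, (0, 0), (9, 0)))  # True: row 0 is free
    print(route_is_clear(grid, (0, 2), (9, 2)))  # False: blocked at (5, 2)
```

In a real AMR, the equivalent check runs continuously against live scans rather than a static grid, and the planner reroutes around blocked cells instead of simply rejecting the path.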

Advantages and Disadvantages of AGVs

The main advantage of AGVs is their ability to follow a predetermined route with high precision and consistency, making them ideal for high-volume, repetitive tasks. However, AGVs require substantial infrastructural changes, such as installing and maintaining tapes, wires, or reflectors. These components are susceptible to wear and tear, and any necessary adjustments can significantly disrupt operations. Furthermore, AGVs cannot navigate around obstacles autonomously, which can lead to operational downtime.

Advantages and Disadvantages of AMRs

While AMRs have a higher initial cost and are less predictable in terms of exact travel time compared to AGVs, their advantages are substantial. AMRs do not require any infrastructural changes, as they can localize and navigate without markers. Updates to tasks and goals can be made quickly through software, and facility expansions or automation upgrades can be accommodated by generating new maps or extending existing ones.

In dynamic environments, the ease of use, flexibility, and scalability of AMRs provide a clear advantage. Their enhanced sensing capabilities, including longer-range Lidar, 3D depth sensing, radar, and RGB vision technologies, combined with superior computing power and artificial intelligence, enable advanced features and improved human-robot interaction.

Conclusion

In the evolving landscape of industrial automation, the choice between AGVs and AMRs hinges on specific operational needs. AGVs excel in environments requiring high precision and repetitive tasks but come with significant infrastructure and maintenance demands. AMRs offer unparalleled flexibility and adaptability, making them suitable for dynamic and rapidly changing environments. Ultimately, the decision should be based on the specific requirements and future scalability of the automation tasks at hand.


Breakthrough in Computer Vision Speeds Up Screening of Electronic Materials

Fri, 06/14/2024 - 13:36

A new computer vision technique developed by MIT engineers is set to revolutionize the screening process for electronic materials, significantly speeding up the characterization phase, which has been a major bottleneck. This innovation could dramatically boost the development of high-performance materials for solar cells, transistors, LEDs, and batteries.

Key Advancements
  1. Speed and Efficiency: The new method characterizes electronic properties of materials 85 times faster than conventional methods. This is achieved through the use of computer vision algorithms that analyze images of printed semiconducting samples.
  2. Key Properties Analyzed: The technique estimates two critical electronic properties:
  • Band Gap: The energy required to excite electrons into a conducting state.
  • Stability: The longevity of the material under various conditions.
  3. Automation and Integration: The technique is designed to be part of a fully automated materials screening system, potentially leading to an autonomous lab setup. This system would continuously make and test new materials based on AI predictions, operating 24/7 until the optimal material is discovered.
  4. Applications and Benefits: The technique can be applied across various fields, including solar energy, transparent electronics, and advanced transistors. It leverages the richness of hyperspectral imaging data, processed by sophisticated algorithms, to quickly and accurately determine material properties.

Detailed Process

Hyperspectral Imaging: Unlike standard cameras, hyperspectral cameras capture detailed images with 300 colour channels. The first algorithm processes this data to compute the band gap swiftly.

Stability Assessment: The second algorithm uses standard RGB images to monitor changes in the material’s colour over time, correlating these changes to stability.
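
The study’s exact algorithms are not reproduced here, but the stability idea can be illustrated with a minimal sketch: track the mean per-pixel colour change of a sample relative to the first RGB frame and turn that drift into a simple score. The metric, function names and simulated data below are assumptions for illustration only.

```python
# Minimal illustration of the stability idea described above (the paper's
# actual algorithm is not reproduced): quantify degradation as the mean
# per-pixel colour change of a sample region relative to the first frame.

import numpy as np

def colour_change_curve(frames):
    """frames: list of HxWx3 uint8 RGB arrays of the same sample over time.
    Returns the mean Euclidean RGB distance to the first frame, per frame."""
    ref = frames[0].astype(float)
    return [float(np.mean(np.linalg.norm(f.astype(float) - ref, axis=-1)))
            for f in frames]

def stability_score(frames):
    """Toy metric: higher when the colour drifts less (more stable sample)."""
    return 1.0 / (1.0 + colour_change_curve(frames)[-1])

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    base = rng.integers(0, 255, size=(8, 8, 3), dtype=np.uint8)
    # Simulate a sample that slowly darkens over five time steps
    frames = [np.clip(base.astype(int) - 10 * t, 0, 255).astype(np.uint8)
              for t in range(5)]
    print([round(c, 1) for c in colour_change_curve(frames)])
    print("stability score:", round(stability_score(frames), 3))
```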

Research and Development

The technique was developed and tested by MIT researchers, including graduate students Eunice Aissi and Alexander Siemenn, with contributions from their colleagues and international collaborators.

Validation and Accuracy: When compared to manual characterization by experts, the new method showed 98.5% accuracy for band gap estimation and 96.9% accuracy for stability, demonstrating both speed and precision.

Future Outlook

The researchers envision integrating this technique into a fully automated materials discovery pipeline, enhancing the speed and efficiency of developing new electronic materials. This innovation holds promise for significant advancements in renewable energy and electronic technologies.

This breakthrough in computer vision and materials science marks a significant step towards more efficient and rapid development of advanced functional materials, essential for the next generation of electronic devices.

For more detailed information, you can refer to the original study published in Nature Communications.


Cadence and Samsung Foundry Accelerate Chip Innovation for Advanced AI and 3D-IC Applications

Thu, 06/13/2024 - 15:02
  • AI digital and analog tools optimized for advanced node SF2 gate-all-around (GAA), driving enhanced quality of results and accelerating circuit process node migration
  • Cadence’s best-in-class 3D-IC technology enabled for all of Samsung Foundry’s multi-die integration offerings, accelerating the design and assembly of stacked chiplets
  • Cadence’s broad IP portfolio and tools for next-generation AI designs will enable customers to achieve first-pass silicon success and accelerate time to market

Cadence Design Systems, Inc. (Nasdaq: CDNS) today announced a broad collaboration with Samsung Foundry that includes technology advancements to accelerate design for AI and 3D-IC semiconductors, including on Samsung Foundry’s most advanced gate-all-around (GAA) nodes. The ongoing collaboration between Cadence and Samsung significantly advances system and semiconductor development for the industry’s most demanding applications, including AI, automotive, aerospace, hyperscale computing and mobile.

Through this close collaboration, Cadence and Samsung have demonstrated the following:
  • AI enables lower leakage power and development of SF2 GAA test chips: Cadence, in close collaboration with Samsung Foundry, has leveraged the Cadence® Cerebrus Intelligent Chip Explorer and its AI technology in both DTCO and implementation to minimize leakage power on their SF2 GAA platform. Compared to the best-performing baseline flow, the Cadence.AI result achieved a more than 10% reduction in leakage power. As part of this ongoing collaboration, a mutual customer is actively involved in the development of a test chip using Cadence.AI for an SF2 design.
  • Cadence backside implementation flow certified for Samsung Foundry SF2: As a result of extensive collaboration between Cadence and Samsung Foundry, a complete Cadence backside implementation flow has been certified for the SF2 node to accelerate the development of advanced designs. The full Cadence RTL-to-GDS flow, including the Genus™ Synthesis Solution, Innovus™ Implementation System, Quantus™ Extraction Solution, Pegasus™ Verification System, Voltus™ IC Power Integrity Solution and Tempus™ Timing Signoff Solution has been enhanced to support backside implementation requirements such as backside routing, nano TSV insertion, placement and optimization, signoff parasitic extraction, timing and IR analysis, and DRC. The Cadence backside implementation flow has been validated with a successful Samsung SF2 test chip, demonstrating the flow is ready for use.
  • Cadence has collaborated with Samsung Foundry to enable solutions for Samsung Foundry’s multi-die offerings: The Cadence Integrity™ 3D-IC platform is enabled for all of Samsung’s multi-die integration offerings, and its early analysis and package awareness features are now compliant with Samsung’s 3DCODE 2.0 version. In addition, Cadence and Samsung have expanded the multi-die collaboration by enabling differentiating technologies like thermal warpage analysis using the Cadence Celsius Studio and system-level LVS with Cadence Pegasus Verification System. Cadence is also supporting Samsung with a package PDK that reduces design time with the Allegro X system. Combined with the Integrity 3D-IC platform, it optimizes the package design flow.
  • AI-powered Virtuoso Studio flow successfully deployed for analog circuit process migration: Purpose-based instance mapping in the AI-powered Virtuoso Studio provided rapid retargeting of the schematics, while circuit optimization in Virtuoso Studio’s Advanced Optimization Platform helped Samsung achieve a 10X improvement in turnaround time when migrating a 100MHz oscillator design from 14nm to 8nm. In addition, a FinFET-to-GAA analog design migration reference flow is available for joint customers, with successful experimental results.
  • Cadence mmWave RFIC design flow successfully used to tape out a 14RF circuit design: Cadence and Samsung successfully taped out a 48GHz power amplifier design, representing silicon validation of the robust, full system reference flow that leverages the Cadence EMX Designer to create passive devices with fast modeling and layout automation. Full design EM extraction with the EMX 3D Planar Solver and EM/IR analysis using Voltus XFi and Quantus ensured that the IC met aggressive metrics. Pegasus was used for signoff DRC/LVS, while AWR VSS provided a seamless environment to carry out initial system-level budgeting and post-layout verification. Mutual customers can feel confident utilizing this flow to deliver leading-edge designs to market in a timely manner.
  • Cadence Pegasus Verification System is certified for Samsung Foundry’s 4nm and 3nm process technologies: Through the collaboration with Samsung Foundry, the Cadence physical verification flow is optimized to allow mutual customers using Samsung Foundry’s advanced nodes to reach signoff accuracy and runtime goals for a faster time to market. The Pegasus system is now certified across multiple advanced nodes at Samsung Foundry, which are proven and in production by customers, with simplified, all-inclusive licensing support. The Pegasus system is integrated into the AI-powered Cadence Virtuoso Studio as iPegasus to enable in-design signoff quality DRC and interactive metal fill in the layout implementation, offering up to 4X faster turnaround times.
  • Cadence IP portfolio offers comprehensive industry solutions on advanced Samsung nodes:

Cadence’s latest IP built on Samsung SF5A includes industry-leading PHY IP for 112G-ULR SerDes, PCIe® 6.0/5.0, UCIe™, DDR5-8400, DDR5/4-6400 Memory and USB 2.0, offering customers complete platform solutions.

Cadence’s PHY IP for PCIe 6.0 on Samsung SF5A has been successfully certified for PCIe 5.0 x8 compliance and demonstrated seamless interoperability with other PCIe 5.0/6.0 systems and test equipment, further showcasing its PCIe solution maturity.

Cadence is furthering its partnership with Samsung Foundry by pushing the performance envelope, designing advanced memory IP for GDDR7 on Samsung SF4X and SF2, and helping reshape the HPC/AI industry with this new memory standard.

  • Advanced verification for AI design complexity: Samsung Foundry applied Cadence’s advanced verification technologies, such as the Palladium Enterprise Emulation System, JasperC, STG, and Xcelium ML, to tackle rising AI chip complexity and achieve time-to-market requirements in SF3.

“We are honored to partner with Samsung, a true example of a chips-to-systems company, to bring this technology for our joint partners to design the next generation of intelligent systems,” said Tom Beckley, senior vice president and general manager in the Custom IC & PCB Group at Cadence. “The hyperconvergence of AI with modern accelerated compute requires a strong silicon infrastructure. With these new AI-powered, certified design flows and standardized solutions, mutual customers can confidently design for Samsung advanced nodes while achieving their design and time-to-market goals.”

“Samsung and Cadence have a close collaboration to advance technology and help our customers deliver competitive designs to the market efficiently,” said Sangyun Kim, Vice President and head of Foundry Design Technology Team at Samsung Electronics. “Our joint efforts enable customers to utilize Samsung’s latest process and technology innovations to push the limits for the most advanced AI, hyperscale computing and mobile SoC designs.”

To learn more about Cadence AI offerings, please visit: Cadence.ai.


Industry’s First PCIe 7.0 IP Solution for Next-Gen HPC and AI Chip Designs

Thu, 06/13/2024 - 14:03

Synopsys recently unveiled the industry’s first complete PCIe 7.0 IP solution, designed to accelerate trillion-parameter HPC and AI supercomputing chip designs. This new IP solution future-proofs bandwidth for hyperscale AI data centre infrastructure, addressing the demanding requirements of transferring massive amounts of data for compute-intensive AI workloads.

Key Highlights of Synopsys’ PCIe 7.0 IP Solution
  1. Complete Solution: Synopsys offers the industry’s only complete PCIe 7.0 IP solution, including the controller, IDE security module, PHY, and verification IP. This solution enables data transfers of up to 512 GB/s bidirectional in an x16 configuration (a quick bandwidth check follows this list).
  2. Power Efficiency and Low Latency: The pre-verified PCIe 7.0 Controller and PHY IP provide low latency data transfers and up to 50% more power efficiency compared to prior versions while maintaining signal integrity.
  3. Security: The Synopsys IDE Security Module for PCIe 7.0, pre-verified with the Controller IP, offers data confidentiality, integrity, and replay protection against malicious attacks, ensuring a secure data transfer environment.
  4. Experience and Reliability: With more than two decades of PCIe IP experience and over 3,000 design wins, Synopsys offers a low-risk path to silicon success, providing customers with a robust and reliable IP solution.
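
As a quick sanity check of the 512 GB/s figure in item 1 (assuming the nominal PCIe 7.0 raw signalling rate of 128 GT/s, roughly 128 Gb/s per lane per direction, and ignoring flit and encoding overhead), the arithmetic works out as follows:

```python
# Back-of-the-envelope check of the 512 GB/s x16 bidirectional figure quoted
# above, assuming a nominal 128 GT/s (~128 Gb/s) raw rate per lane per
# direction and ignoring flit/encoding overhead.

RAW_GBPS_PER_LANE = 128  # PCIe 7.0 nominal per-lane rate, each direction
LANES = 16
BITS_PER_BYTE = 8

per_direction_gb_s = RAW_GBPS_PER_LANE * LANES / BITS_PER_BYTE  # 256 GB/s
bidirectional_gb_s = 2 * per_direction_gb_s                     # 512 GB/s

print(f"x{LANES}: {per_direction_gb_s:.0f} GB/s per direction, "
      f"{bidirectional_gb_s:.0f} GB/s bidirectional")
```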

This solution is crucial for chip makers addressing the bandwidth and latency challenges posed by large language models and compute-intensive AI workloads. Synopsys’ PCIe 7.0 IP solution supports secure data transfers, mitigating AI workload data bottlenecks and enabling seamless interoperability within the ecosystem.

At the PCI-SIG DevCon in Santa Clara, Synopsys demonstrated the world’s first PCIe 7.0 IP over optics, showcasing the technology’s capabilities in real-world scenarios. This includes Synopsys PCI Express 7.0 PHY IP electrical-optical-electrical (E-O-E) TX to RX running at 128 Gb/s with OpenLight’s Photonic IC, and successful root complex to endpoint connection with FLIT transfer using Synopsys PCIe 7.0 Controller IP.

Synopsys’ PCIe 7.0 IP solution is part of a broader portfolio for high-performance computing (HPC) SoC designs, including solutions for 1.6T/800G Ethernet, CXL, and HBM. The company’s extensive interoperability testing, comprehensive technical support, and robust IP performance help designers accelerate time to silicon success and production.

Industry leaders such as Intel, Astera Labs, Enfabrica, Kandou, XConn, Rivos, and Microchip have embraced PCIe 7.0 for AI data center infrastructure, recognizing its importance in delivering high-bandwidth, low-latency connectivity critical for data-intensive and latency-sensitive workloads.

Overall, Synopsys’ PCIe 7.0 IP solution represents a significant advancement in enabling next-generation HPC and AI chip designs, providing a secure, efficient, and high-performance interconnect solution for the evolving demands of hyperscale AI data centers.


Infineon receives German Brand Award for “Corporate Brand of the Year”

Thu, 06/13/2024 - 13:53

Infineon Technologies AG received the German Brand Award in the renowned “Best of Category” as “Excellent Brands – Corporate Brand of the Year”. The German Council of Design recognizes Infineon’s exceptional brand development, highlighting the company’s dedication to establishing a consistent brand that harmonizes seamlessly with its corporate strategy.

“To receive the German Brand Award as Corporate Brand of the Year is a special recognition for Infineon’s brand development over the past years,” said Andreas Urschitz, Member of the Management Board and Chief Marketing Officer of Infineon. “We are a global technology and thought leader with a clear vision and decisive actions. As a company, we are dedicated to driving decarbonization and digitalization through our solutions and in our business areas, together with our customers and partners. This commitment is deeply rooted in our corporate strategy, our brand, and within the entire global Infineon team.”

The award underlines Infineon’s commitment to excellence and innovation in brand strategy and design. It also reflects a strategic and decisive approach in the brand and corporate strategy, which ultimately enhances the company’s market presence with its audience.

The jury of the German Brand Award, which consists of members of the German Council of Design, acknowledged Infineon’s brand identity that resonates with its target audience while continuously staying true to its core values and vision. The jury’s statement reads: “Infineon has been a strong brand for 25 years – and also ‘Corporate Brand of the Year’ in 2024. The semiconductor manufacturer has decisively developed its strategy and design to link the brand even more closely with the corporate strategy. The close integration, including vision, mission and values, is exemplary and contributes to an outstanding positioning. Only a few companies in the competitive arena have such a consistent and distinctive brand. The dedicated 360-degree brand development and, above all, implementation is credible and has a high unique selling point.”

The German Brand Award is the award for successful brand management, initiated by Germany’s design and brand authority. Judged by a top-tier jury of experts from brand management and brand science, the German Brand Award discovers, presents and honors unique brands and brand makers.


TI unveils industry’s first GaN IPM to enable smaller, more energy-efficient high-voltage motors

Thu, 06/13/2024 - 13:37

News highlights

  • 650V intelligent power module (IPM) enables more than 99% inverter efficiency for appliances and HVAC systems by integrating TI’s gallium nitride (GaN) technology.
  • Engineers can reduce solution size by up to 55% as a result of the IPM’s high integration and its efficiency, which removes the need for an external heat sink.

Texas Instruments (TI) (Nasdaq: TXN) today introduced the industry’s first 650V three-phase GaN IPM for 250W motor drive applications. The new GaN IPM addresses many of the design and performance compromises engineers typically face when designing major home appliances and heating, ventilation and air-conditioning (HVAC) systems. The DRV7308 GaN IPM enables more than 99% inverter efficiency, optimized acoustic performance, reduced solution size and lower system costs. It is on display at the Power Electronics, Intelligent Motion, Renewable Energy and Energy Management (PCIM) Conference, held June 11-13 in Nuremberg, Germany.

“Designers of high-voltage home appliances and HVAC systems are striving to meet higher energy-efficiency standards to support environmental sustainability goals around the world,” said Nicole Navinsky, Motor Drives business unit manager at TI. “They are also addressing consumer demand for systems that are reliable, quiet and compact. With TI’s new GaN IPM, engineers can design motor driver systems that deliver all of these expectations and operate at peak efficiency.”

Improve system efficiency and reliability with TI GaN

Worldwide efficiency standards for appliances and HVAC systems such as SEER, MEPS, Energy Star and Top Runner are becoming increasingly stringent. The DRV7308 helps engineers meet these standards, leveraging GaN technology to deliver more than 99% efficiency and improve thermal performance, with 50% reduced power losses compared to existing solutions.

In addition, the DRV7308 achieves industry-low dead time and low propagation delay, both less than 200ns, enabling higher pulse-width modulation (PWM) switching frequencies that reduce audible noise and system vibration. These advantages plus the higher power efficiency and integrated features of the DRV7308 also reduce motor heating, which can improve reliability and extend the lifetime of the system.
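
To put the sub-200 ns figure in perspective, the short calculation below (illustrative only; the switching frequencies are example values, not TI data) shows what fraction of a PWM period a 200 ns dead time consumes when it is inserted once per switching edge:

```python
# Simple arithmetic, not TI data: fraction of each PWM period consumed by a
# 200 ns dead time (inserted twice per period, once per switching edge) at
# example switching frequencies.

DEAD_TIME_S = 200e-9  # upper bound quoted above

for f_sw_khz in (5, 20, 50, 100):
    period_s = 1.0 / (f_sw_khz * 1e3)
    fraction = 2 * DEAD_TIME_S / period_s  # two edges per PWM period
    print(f"{f_sw_khz:3d} kHz: dead time is {fraction * 100:.2f}% of the period")
```

At low switching frequencies the dead time is a negligible share of the period, but that share grows linearly with frequency, which is why a short dead time matters when pushing PWM frequencies up to reduce audible noise and vibration.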

To learn more about the benefits of GaN technology, read the white paper, “How three-phase integrated GaN technology maximizes motor-drive performance.”

Advanced integration and high power density reduce solution size and costs

Supporting the trend of more compact home appliances, the DRV7308 helps engineers develop smaller motor drive systems. Enabled by GaN technology, the new IPM delivers high power density in a 12mm-by-12mm package, making it the industry’s smallest IPM for 150W to 250W motor-drive applications. Because of its high efficiency, the DRV7308 eliminates the need for an external heatsink, resulting in motor drive inverter printed circuit board (PCB) size reduction of up to 55% compared to competing IPM solutions. The integration of a current sense amplifier, protection features and inverter stage further reduces solution size and cost.

To learn about designing more efficient, compact motor systems, see the GaN IPM page on TI.com.

This high-efficiency, high-voltage GaN IPM is the latest example of TI innovations to help solve engineering challenges and transform motor designs.

TI’s reliable high-voltage technology at PCIM 2024

Visitors to PCIM can see new products and solutions from TI that are enabling the transition to a more sustainable future with reliable high-voltage technology in Hall 7, Booth 652. In addition to the DRV7308 GaN IPM, TI highlights at PCIM include:

  • Next-generation electric vehicle (EV) propulsion system: TI is demonstrating a new 800V, 750kW SiC-based scalable traction inverter system for EV six-phase motors, in collaboration with EMPEL Systems. The demonstration features high power density and efficiency using TI’s high-performance isolated gate drivers, isolated DC/DC power modules and Arm® Cortex®-R MCUs.
  • Speaking session: TI’s manager of high-voltage power systems applications, Sheng-Yang Yu, will speak on June 11 at Markt & Technik panel discussion: “Will SiC ultimately Hold its Own against GaN?”
  • Speaking session: TI’s manager of renewable energy systems, Harald Parzhuber, will speak on June 12 at Bodo’s Power Systems panel discussion: “GaN Wide Bandgap Design, the Future of Power.”

For more information about all of TI’s speakers and demonstrations at PCIM, see ti.com/PCIM. 

Available today on TI.com

Pre-production quantities of the DRV7308 three-phase, 650V integrated GaN IPM are available for purchase now on TI.com.

  • Pricing starts at US$5.50 in 1,000-unit quantities.
  • Available in a 12mm-by-12mm, 60-pin quad flat no-lead (QFN) package.
  • The DRV7308EVM evaluation module is also available at US$250.
  • Multiple payment, currency and shipping options are available.


Anritsu, in Collaboration with Sony Semiconductor Israel, Acquires Industry-First GCF Certification for Non-terrestrial Network (NTN) NB-IoT RF Conformance Testing

Thu, 06/13/2024 - 13:17

Anritsu Corporation announces that the first NTN NB-IoT RF conformance tests have been validated on the New Radio RF Conformance Test System ME7873NR, powered by Sony Semiconductor Israel (Sony)’s Altair device.

The ME7873NR has acquired Global Certification Forum (GCF) certification for NB-IoT RF conformance testing at the Conformance Agreement Group (CAG) #78 meeting, a first in the industry. The conformance tests are defined by 3GPP in TS 36.521-4, corresponding to the core requirements of TS 36.102, and have been submitted by Anritsu to the 3GPP Radio Access Network Working Group 5 (RAN WG5).

“We are pleased to collaborate with Anritsu on this important initiative,” said Levana Asraf Fouks, Sr. Director, System Validation & PM Manager, System Engineering at Sony Semiconductor Israel. “Our combined expertise means that our customers benefit from enhanced capabilities to meet their own evolving needs. By partnering with Anritsu from the early stages, we’re able to work towards a swift certification process for modules and devices. The validation of NTN NB-IoT RF conformance tests is a major step forward for the industry.”

“NTN NB-IoT, which is defined by 3GPP Release 17, is a standard determining the current use of NB-IoT in the NTN and enables new use cases and monetization opportunities for vertical industry segments,” said Keiji Kameda, General Manager of the Mobile Solutions Division at Anritsu Corporation. “We are proud that our collaboration with Sony enables us to help the industry validate new features so they can quickly reach the market and attain certification in GCF/PTCRB to realize new devices that enable new applications.”

Product Outline

The ME7873NR is an automated system for 3GPP TS 38.521/TS 38.533 5G NR RF and RRM tests.

The ME7873NR supports NB-IoT NTN as specified in 3GPP TS 36.521-3 and TS 36.521-4, in anticipation of future support for 5G NTN. Customers can also upgrade from the ME7873LA to the ME7873NR simply by adding a control PC, and Anritsu contributes to a smooth transition from NB-IoT NTN to NR NTN.

The post Anritsu, in Collaboration with Sony Semiconductor Israel, Acquires Industry-First GCF Certification for Non-terrestrial Network (NTN) NB-IoT RF Conformance Testing appeared first on ELE Times.

STMicroelectronics releases energy-efficient autonomous inertial measurement unit with industrial product longevity

Thu, 06/13/2024 - 13:02

Targets vibration sensing and motion tracking in industrial and robotics applications

The STMicroelectronics ISM330BX 6-axis inertial measurement unit (IMU) combines edge-AI processing, an analog hub for sensor expansion, and ST’s Qvar electric charge variation sensing with product-longevity assurance for energy-efficient industrial sensing and motion tracking applications.

The IMU contains a 3-axis gyroscope and a 3-axis accelerometer with a low-noise architecture and bandwidth up to 2kHz, suitable for vibration sensing in machine-tool condition-monitoring applications. Additional use cases include industrial and domestic robots and automated guided vehicles (AGV), intelligent appliances, and motion trackers.

Also integrating ST’s edge-processing engine, which teams a machine-learning core (MLC) running AI algorithms with a finite state machine (FSM), the ISM330BX offloads the host processor and saves system power. The IMU also embeds ST’s Sensor Fusion Low-Power (SFLP) algorithm for 3D orientation tracking, which can enhance energy efficiency in applications such as robotics and smart safety helmets. Leveraging adaptive self-configuration (ASC), the sensor can also automatically optimize its settings in real time for the best performance and power.
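
To illustrate the kind of processing that a sensor-side machine-learning core and finite state machine can take over from the host, the following minimal Python sketch flags vibration anomalies from windowed accelerometer data. It is a conceptual illustration only: the window size, output data rate and threshold are assumed values, and the code does not represent ST’s MLC/FSM configuration format.

# Conceptual sketch of vibration-anomaly detection of the kind an in-sensor
# machine-learning core / finite state machine can offload from the host.
# Window size, data rate and threshold are assumed values for illustration.

import math
import random

WINDOW = 52            # samples per window (e.g., ~1 s at an assumed 52 Hz ODR)
RMS_THRESHOLD = 1.2    # g, assumed anomaly threshold on acceleration magnitude
CONSECUTIVE = 3        # windows above threshold before flagging an anomaly

def window_rms(samples):
    """Root-mean-square of the accelerometer magnitude over one window."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def detect_anomalies(magnitude_stream):
    """Yield (window_index, rms, state) as complete windows are processed."""
    state, above, buf = "NORMAL", 0, []
    for i, sample in enumerate(magnitude_stream):
        buf.append(sample)
        if len(buf) == WINDOW:
            rms = window_rms(buf)
            buf.clear()
            above = above + 1 if rms > RMS_THRESHOLD else 0
            state = "ANOMALY" if above >= CONSECUTIVE else "NORMAL"
            yield i // WINDOW, rms, state

if __name__ == "__main__":
    # Simulated magnitude data: a quiet machine, then a strong vibration event.
    quiet = [1.0 + random.gauss(0, 0.05) for _ in range(WINDOW * 5)]
    noisy = [1.0 + random.gauss(0, 1.0) for _ in range(WINDOW * 5)]
    for w, rms, state in detect_anomalies(quiet + noisy):
        print(f"window {w}: rms={rms:.2f} g -> {state}")

In a real deployment, equivalent logic would run inside the IMU itself, so the host would only wake on an interrupt when the state machine reports an anomaly.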

With its autonomous capabilities, the ISM330BX reduces data transmission between the IMU and the host system and offloads the main processor, ensuring low latency and low power consumption. The integrated analog hub provides further opportunities for energy-efficient system integration by connecting external analog sensors directly to the edge-processing engine for data filtering and AI inference. These power-saving features help engineers realize innovative industrial sensors and other battery-powered smart devices, which can also be used to upgrade existing industrial assets, making them smarter and ready for Industry 5.0.

With ST’s Qvar electric charge variation detector also built-in, the IMU can integrate touch and close-proximity detection, or value-added functionality such as water leak sensing, to boost system integration and energy efficiency.

The launch of the ISM330BX advances MEMS technology for industrial markets and further enriches the MEMS ecosystem by giving developers access to essential hardware such as the STEVAL-MKI245KA adapter board. Free software resources are also available, including MEMS Studio and ready-to-use application examples on GitHub, providing a collaborative environment for innovation.

Unveiling the future of industrial motion sensing, the ISM330BX IMU will be showcased at the upcoming Sensor and Test 2024 event. At the launch, a keynote speech will explore the innovative aspects of the ISM330BX for monitoring robotic applications and its transformative potential for the future of motion sensing. A highlight of the event will be a dedicated demo session, where attendees can see for themselves the capabilities of the ISM330BX in anomaly-detection applications.

The ISM330BX is included in ST’s 10-year product-longevity program for industrial components and is in production now, priced from $4.00 for orders of 1,000 pieces.

The post STMicroelectronics releases energy-efficient autonomous inertial measurement unit with industrial product longevity appeared first on ELE Times.

Vertiv Concludes Masterclass Training on AI and Data Center Infrastructure in Bangalore

Thu, 06/13/2024 - 13:02

Vertiv Xpress Masterclass Roadshow provides training for local data center professionals on the impact of artificial intelligence

Vertiv (NYSE: VRT), a global provider of critical digital infrastructure and continuity solutions, concluded the Vertiv Xpress Masterclass Series for Bangalore, providing training on Artificial Intelligence (AI) and critical digital infrastructure to industry professionals. The event is an annual initiative that underlines Vertiv’s commitment to encouraging innovation and knowledge transfer within the industry. The key focus of the event was AI and its impact on data centers and supporting critical digital infrastructure. The event also highlighted the importance of power management and thermal management of critical digital infrastructure.

Over the course of the event, consulting engineers, infrastructure designers, and stakeholders from diverse sectors came together to discuss pressing issues and challenges, with the program providing insights into critical digital infrastructure innovations and advancements that will support AI and high-performance computing (HPC) adoption. The event drew more than 180 attendees from sectors including data centers, BFSI, manufacturing, healthcare and government, among others.

“At Vertiv, we recognize the transformative potential of AI in the data center space. AI is not just an emerging trend but a critical component that is redefining the way data centers operate. The Vertiv Xpress Masterclass Series event in Bangalore with its deep focus on AI underscores our commitment to fostering innovation and equipping our partners and clients with cutting-edge solutions that drive the future of critical digital infrastructure,” said B Venkat Rao, senior director, enterprise key accounts, Vertiv India.

For more information on Vertiv and its complete portfolio of digital infrastructure and continuity solutions, visit Vertiv.com.

The post Vertiv Concludes Masterclass Training on AI and Data Center Infrastructure in Bangalore appeared first on ELE Times.

Revolutionizing Industrial 3D Printing with ALTRA 280 and IPSO 105: High-Temperature Game-Changers

Thu, 06/13/2024 - 12:42

BigRep, a prominent player in the field of additive manufacturing, has developed two cutting-edge 3D printers designed specifically for industrial applications. The ALTRA 280 and IPSO 105, engineered to deliver speed, reliability, precision, and automation, mark a significant advancement in the realm of high-temperature 3D printing.

The ALTRA 280 and IPSO 105 represent BigRep’s commitment to meeting the evolving needs of industries such as aerospace, defence, and automotive. These printers are equipped to handle a wide range of materials, from standard polymers to high-performance ones, making them versatile solutions for various manufacturing requirements.

Key Features of ALTRA 280

The ALTRA 280 boasts large-scale printing capability, with a build volume of 500 mm x 700 mm x 800 mm, allowing the production of intricate and sizable components. Its high-temperature capabilities, reaching up to 450°C, enable the printing of robust parts suited to challenging industrial applications. The machine is designed for uninterrupted productivity thanks to its dual extrusion system and backup systems, ensuring reliable performance around the clock.

Key Features of IPSO 105

On the other hand, the IPSO 105 offers a generous toolmaker’s build volume of 400mm x 600mm x 440mm, catering to a wide range of applications from tooling to end-use parts. With high-temperature capabilities of up to 180°C, the IPSO 105 can handle a variety of engineering-grade and high-performance materials, including multi-material parts facilitated by its dual extrusion system. The automated start and production processes enhance reliability and streamline operations.

Industry Impact and Significance

Both the ALTRA 280 and IPSO 105 are seen as game-changers in industrial additive manufacturing. Their ability to process diverse polymers, coupled with their high-temperature capabilities, positions them as pivotal solutions for industries requiring high-performance parts. Thomas Janics, from HAGE3D, anticipates a transformative impact on additive manufacturing, particularly in sectors with demanding specifications.

Conclusion

BigRep’s introduction of the ALTRA 280 and IPSO 105 represents a significant leap forward in the realm of high-temperature 3D printing. These printers, engineered with a focus on speed, reliability, precision, and automation, are poised to revolutionize industrial additive manufacturing across sectors such as aerospace, defense, and automotive. With their advanced features and capabilities, they offer manufacturers the tools they need to stay competitive and innovative in today’s rapidly evolving manufacturing landscape.

The post Revolutionizing Industrial 3D Printing with ALTRA 280 and IPSO 105: High-Temperature Game-Changers appeared first on ELE Times.

Key Trends Charting India’s Economic Transformation

Thu, 06/13/2024 - 12:14

The latest EY Economy Watch report highlights several key trends shaping India’s economic transformation, including emerging technologies like generative AI (GenAI), climate challenges, de-globalization, and de-dollarization. These trends present both challenges and opportunities for India as it aims to achieve ‘Viksit’ status by 2047 and navigate its path towards becoming a developed nation.

One of the major potential growth drivers identified in the report is GenAI, which could significantly enhance productivity and output. The report estimates that GenAI has the potential to boost India’s GDP by US$359-438 billion by FY30. However, harnessing the benefits of emerging technologies while ensuring a net positive impact on employment will require strategic policy interventions.

Climate change is another critical factor affecting economic resilience, with increased natural disasters posing significant risks to the economy. The report emphasizes the need for investments in climate-resilient technology and innovation to mitigate these risks.

The ongoing trend of de-globalization and trade disruptions also presents challenges, but India has opportunities to strengthen its position as a trade connector due to its strategic location. Initiatives such as the operation of Iran’s Chabahar port and the development of new trade routes could enhance India’s trade connectivity.

Additionally, India’s lower government indebtedness provides a buffer against external shocks, giving the nation greater fiscal space to implement macro-stabilization efforts during economic downturns. However, responsible fiscal behaviour and careful policymaking will be crucial to avoid potential pitfalls in the growth process.

Overall, the report underscores the importance of strategic policy support and responsible fiscal behaviour in navigating these global trends and leveraging emerging technologies for sustainable economic growth.

The post Key Trends Charting India’s Economic Transformation appeared first on ELE Times.

How BMS Improves Range, Extends Battery Life, and Ensures Li-ion Safety and Reliability

Thu, 06/13/2024 - 10:44

With the electric vehicle (EV) market gaining momentum, driven by enthusiastic consumer interest, there is an urgent need to tackle critical issues related to EV range, reliability, and battery life. Concerns about range limitations, compounded by limited charging infrastructure and safety issues highlighted in media reports, have hindered the adoption of EVs. In this context, the semiconductor automated test equipment (ATE) sector can play a crucial role in addressing and mitigating these concerns.

The Role of Battery Management Systems (BMS) in EV Evolution

Battery management systems (BMS) are integral to the performance and safety of EV batteries. By performing detailed cell-by-cell monitoring, a BMS can ensure battery longevity, safety, and an extended driving range. Automated test equipment (ATE), such as Teradyne’s ETS-800, is essential for testing the accuracy of these battery monitors.

Core Functions of BMS
  1. Monitoring and Managing Performance:
  • Ensures balanced charging across individual cells (a passive-balancing sketch follows this list).
  • Uses semiconductors for multi-channel monitoring and balancing.
  • Manages charge distribution and monitors cell state and overall battery health amid temperature and voltage changes.
  2. Assessing Cell Voltage, Current, and Temperature:
  • Determines the state of charge (SoC) and health of the battery pack.
  • Relays information to a central processing unit for optimal performance management.
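
As a concrete illustration of the cell-balancing function listed above, the short Python sketch below decides which cells should be bled down during charging. The 10 mV tolerance and the passive (bleed-resistor) scheme are assumptions for illustration and are not tied to any particular BMS chipset.

# Illustrative passive cell-balancing decision, not tied to any specific BMS
# chipset: cells whose voltage exceeds the lowest cell by more than a set
# tolerance have their bleed resistor enabled during charging.

BALANCE_TOLERANCE_V = 0.010   # 10 mV, assumed balancing threshold

def balancing_plan(cell_voltages):
    """Return one boolean per cell: True = enable that cell's bleed resistor."""
    v_min = min(cell_voltages)
    return [(v - v_min) > BALANCE_TOLERANCE_V for v in cell_voltages]

if __name__ == "__main__":
    cells = [3.652, 3.641, 3.670, 3.644, 3.661, 3.640]   # volts, example segment
    for i, (v, bleed) in enumerate(zip(cells, balancing_plan(cells))):
        print(f"cell {i}: {v:.3f} V -> {'BALANCE' if bleed else 'ok'}")
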
Overcoming Range Anxiety

Improving BMS technology can significantly reduce range anxiety among EV drivers by accurately measuring the vehicle’s charge status, optimizing battery use and extending the driving range. Enhanced accuracy in monitoring the SoC enables more efficient charge and discharge cycles, improving battery life and reducing the need for frequent charging stops (a simple coulomb-counting sketch follows the list below).

  1. Accurate SoC Measurement:
  • Increases usable battery energy.
  • Extends driving range significantly.
  2. Competitive Advantage:

Combined with advancements in battery technology and vehicle design, manufacturers can offer greater mileage on a single charge.
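
The sketch below shows the simplest form of SoC estimation, coulomb counting, in Python. Production BMS firmware typically fuses this with open-circuit-voltage lookups or model-based (e.g., Kalman-filter) estimators; the capacity and current values here are illustrative assumptions.

# Minimal coulomb-counting sketch of state-of-charge (SoC) estimation.
# Capacity and current values are illustrative assumptions; real firmware
# fuses this with voltage- or model-based estimators and corrects for drift.

CAPACITY_AH = 100.0   # assumed usable pack capacity in amp-hours

def update_soc(soc, current_a, dt_s, capacity_ah=CAPACITY_AH):
    """Integrate current over dt to update SoC (0.0-1.0).

    current_a > 0 means discharge, < 0 means charge; the result is clamped.
    """
    delta = (current_a * dt_s / 3600.0) / capacity_ah
    return min(1.0, max(0.0, soc - delta))

if __name__ == "__main__":
    soc = 0.80                                  # start at 80%
    for _ in range(360):                        # one hour at 50 A, 10 s steps
        soc = update_soc(soc, current_a=50.0, dt_s=10.0)
    print(f"SoC after 1 h at 50 A discharge: {soc * 100:.1f}%")   # ~30%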

Li-ion Batteries and BMS: Efficiency and Safety

Accurate monitoring of EV battery cells is crucial for managing range anxiety and ensuring the safety and reliability of Li-ion batteries. While these batteries offer advantages in energy efficiency, density, and lifespan, they require careful energy management to mitigate ignition risks. A sophisticated battery management system (BMS) can safely oversee charge and discharge cycles, extending battery life while minimizing the risk of fire.

Flexible ATE Solutions:

  • Support accurate SoC determination amid evolving battery chemistries.
  • Ensure safety and reliability.

Navigating the complexities of evolving EV battery technologies presents a fresh set of challenges

As the EV industry expands, it drives innovations in battery voltage and architecture, resulting in enhanced capabilities like extended range and rapid charging. This progression underscores the importance of future-proof test equipment that can accommodate new technologies and validate automotive quality standards.

  1. Advancements in Voltage Systems:
  • The move from 400V to 800V systems requires rigorous BMS innovation and testing (see the back-of-the-envelope comparison after this list).
  • Crafting strategies to alleviate consumer worries about range anxiety and battery longevity is crucial to the EV industry’s evolution.
  2. Advanced Instrumentation:

Companies in the semiconductor ATE sector, such as Teradyne, are addressing this demand by introducing solutions like the ETS-800 platform, specifically engineered for precise and high-voltage BMS testing.
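
As the back-of-the-envelope comparison referenced in the list above, the Python sketch below shows why the move from 400V to 800V matters at the system level: at equal power, bus current halves and resistive (I²R) harness losses drop by roughly a factor of four. The power and cable-resistance figures are illustrative assumptions.

# Back-of-the-envelope comparison of 400 V vs 800 V architectures at equal
# power: higher bus voltage halves the current and cuts I^2*R losses ~4x.
# Power and harness-resistance figures are illustrative assumptions.

def bus_current(power_w, voltage_v):
    return power_w / voltage_v

def resistive_loss(current_a, resistance_ohm):
    return current_a ** 2 * resistance_ohm

if __name__ == "__main__":
    POWER_W = 150_000      # 150 kW drive power (assumed)
    R_CABLE = 0.010        # 10 milliohm harness resistance (assumed)
    for v in (400, 800):
        i = bus_current(POWER_W, v)
        loss = resistive_loss(i, R_CABLE)
        print(f"{v} V bus: {i:.0f} A, harness loss ~ {loss / 1000:.2f} kW")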

Teradyne’s ETS-800 Platform

The ETS-800 platform from Teradyne is a cutting-edge automotive test platform designed for high throughput and rapid time to market. Its precision instrumentation and assured specifications ensure stable and repeatable results across testers and devices. The multi-sector architecture of the ETS-800 scales seamlessly from single or low-site-count program development to high-site-count production programs with minimal effort, automatically achieving high parallel-test efficiency.

 

In conclusion, the role of BMS in alleviating EV buyer anxiety cannot be overstated. As the EV market continues to grow, the semiconductor ATE sector’s contributions to BMS testing will be crucial in ensuring the safety, reliability, and extended range of EV batteries, ultimately driving broader adoption and acceptance of electric vehicles.

The post How BMS Improves Range, Extends Battery Life, and Ensures Li-ion Safety and Reliability appeared first on ELE Times.

SICK Unveils Inspector83x: A 2D Vision Sensor Equipped with AI-powered Quality Control

Thu, 06/13/2024 - 10:24

SICK’s latest innovation, the Inspector83x, integrates AI-powered quality control seamlessly into high-speed production environments. It empowers non-specialists to configure precise AI inspections quickly and effectively, thanks to its intuitive design and pre-installed SICK Nova foundation software. Advanced users can further customize inspections using Lua programming and HALCON, ensuring adaptability to diverse production needs.

The Inspector83x boasts up to 5MP resolution, built-in illumination, and a quad-core CPU for high-speed data processing directly on the device. Its capabilities include defect detection, anomaly detection, classification, OCR/OCV reading, and complex assembly verification. Additionally, upcoming features such as colour imaging will enhance applications like colour sorting and quality assurance.

One of its standout features is the ability to simplify complex machine vision tasks for non-specialists. With minimal training examples and the AI function combined with rule-based tools, operators can configure inspections effortlessly. This eliminates the need for extensive machine vision expertise or external consultants when adapting to product changes or new designs.

The Inspector83x also excels in data transfer efficiency, with dual ports for EtherNet/IP or PROFINET integration and a high-speed Gigabit Ethernet port for image data transfer and data logging. Its onboard inputs and outputs, along with precise image output calibration, enable seamless integration with machine controls for quick response and accurate results.

Overall, the SICK Inspector83x redefines AI machine vision by offering simplicity, power, and reliability, making it a valuable asset in various industries such as consumer goods manufacturing, food and beverage, automotive, and packaging.

The post SICK Unveils Inspector83x: A 2D Vision Sensor Equipped with AI-powered Quality Control appeared first on ELE Times.

Pico Technology Ltd launches advanced PicoScope 3000E Series PC-based Oscilloscopes

Thu, 06/13/2024 - 09:56

Pico Technology Ltd., a leader in PC-based test & measurement products, is proud to announce the expansion of its acclaimed PicoScope 3000 series with the introduction of the world’s first USB-powered 5 GS/s oscilloscopes. The two models, with 350 MHz and 500 MHz bandwidth respectively, offer a 5 GS/s sampling rate and 10-bit resolution (up to 14 bits with Enhanced Resolution), setting the standard for next-generation waveform measurement and analysis by combining power, performance and portability.
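
Resolution-enhancement modes of this kind generally work by filtering or averaging consecutive samples in software, trading sample rate and bandwidth for effective bits (roughly one extra bit per 4x averaging when enough noise is present). The Python sketch below illustrates the principle only; it is not Pico Technology’s actual implementation.

# Illustration of software resolution enhancement in general: averaging
# groups of consecutive samples trades sample rate/bandwidth for effective
# resolution (~1 extra bit per 4x averaging, given enough noise/dither).
# This shows the principle only, not Pico Technology's implementation.

import random

def quantize(value, bits, full_scale=1.0):
    """Quantize a value in [-full_scale, +full_scale] to an ADC step."""
    step = 2 * full_scale / (2 ** bits)
    return round(value / step) * step

def enhance(samples, factor):
    """Average non-overlapping groups of `factor` samples."""
    return [sum(samples[i:i + factor]) / factor
            for i in range(0, len(samples) - factor + 1, factor)]

if __name__ == "__main__":
    # Slowly varying signal with ~1 LSB of noise, captured by a 10-bit ADC.
    raw = [quantize(0.3 + 1e-4 * n + random.gauss(0, 0.002), bits=10)
           for n in range(4096)]
    smooth = enhance(raw, 256)   # 256 = 4^4, i.e. roughly 4 extra bits
    print(f"{len(raw)} raw samples -> {len(smooth)} enhanced samples")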

The latest additions to the PicoScope 3000 series combine cutting-edge technology with user-friendly design to meet the diverse needs of engineers, technicians and researchers worldwide. Key features of the new models include:

  • 500 MHz bandwidth, 5 GS/s sampling rate and 10-bit resolution
  • 2 GS Ultra-deep capture memory: Enables capture of long-duration signals at maximum sampling rate and is complemented with the PicoScope DeepMeasure tool that delivers automatic measurements of waveform parameters on up to a million waveform cycles with each triggered acquisition.
  • 200 MS/s 14-bit AWG / Function generator: Offers real-world waveform generation capabilities for a wide range of applications, eliminating the need for additional external equipment.
  • USB 3.0 Type-C connected and powered: Ensures high-speed data transfer and compatibility with the latest generation of PCs, simplifying connectivity and setup. An adaptor for earlier USB port types is provided.
  • PicoScope 7 User Interface for Windows, Mac & Linux with free updates: Offers a modern, intuitive interface that enhances productivity and workflow efficiency across multiple operating systems.
  • 38 Serial Decoders Included as standard: Facilitates the analysis of serial bus communications, simplifying debugging and troubleshooting processes.
  • Segmented Memory, Persistence and Fast waveform updates: Enhances waveform visualization and analysis capabilities, enabling users to extract valuable insights efficiently.
  • Advanced Maths, Measurements, Masks and Digital Triggering: Empowers users with advanced analysis tools for in-depth waveform characterization and interpretation.
  • Customizable Actions that users can set up to run automatically in response to events during long-duration unattended soak tests.
  • SDK (Software Development Kit) that allows users to write bespoke applications with the provided drivers for Windows, macOS and Linux (a structural outline of such an application follows this list).
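
The structural outline below, referenced in the SDK bullet above, shows the typical shape of a bespoke capture application: open the device, configure channels and a trigger, capture a block, and close. The class and method names are placeholders standing in for SDK driver calls; they are not the actual PicoSDK API.

# Hypothetical outline of a bespoke capture application. The class and
# method names are placeholders, NOT actual PicoSDK driver calls; they show
# the typical open -> configure -> capture -> close flow such a program uses.

class FakeScope:
    """Stand-in for an SDK handle so the outline runs without hardware."""

    def open(self):
        print("scope opened")

    def set_channel(self, channel, range_v):
        print(f"channel {channel}: +/-{range_v} V input range")

    def set_trigger(self, channel, level_v):
        print(f"trigger on channel {channel} at {level_v} V")

    def capture_block(self, samples, interval_ns):
        print(f"capturing {samples} samples at {interval_ns} ns intervals")
        return [0.0] * samples        # placeholder waveform data

    def close(self):
        print("scope closed")

def run_capture():
    scope = FakeScope()
    scope.open()
    try:
        scope.set_channel("A", range_v=2.0)
        scope.set_trigger("A", level_v=0.5)
        waveform = scope.capture_block(samples=100_000, interval_ns=200)
        print(f"captured {len(waveform)} samples")
    finally:
        scope.close()

if __name__ == "__main__":
    run_capture()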

Trevor Smith, T&M Product Manager at Pico Technology, expressed his excitement: “With the introduction of our two new PicoScope 3000E series oscilloscopes, we’re redefining the boundaries of mainstream electronic test and waveform analysis. These instruments enable innovation, precision and limitless possibilities, one waveform at a time.”

Availability

The new PicoScope 3000 series oscilloscopes with 350 MHz and 500 MHz bandwidth options are now available for purchase through authorized PicoScope distributors worldwide and on picotech.com. For more information about pricing, specifications and availability, please visit picotech.com or contact your local distributor.

For further information, please visit https://www.picotech.com/oscilloscope/picoscope-3000e-series-500-mhz-5gs-digital-usb-oscilloscope

The post Pico Technology Ltd launches advanced PicoScope 3000E Series PC-based Oscilloscopes appeared first on ELE Times.
