
Hussein Osman, Segment Marketing Director at Lattice Semiconductor – Interview Series


Hussein Osman is a semiconductor industry veteran with over 20 years of experience bringing to market silicon and software products that integrate sensing, processing, and connectivity solutions, with a focus on innovative experiences that deliver value to the end user. Over the past five years he has led the sensAI solution strategy and go-to-market efforts at Lattice Semiconductor, creating high-performance AI/ML applications. Mr. Osman received his bachelor's degree in Electrical Engineering from California Polytechnic State University in San Luis Obispo.

Lattice Semiconductor (LSCC) is a provider of low-power programmable solutions used across communications, computing, industrial, automotive, and consumer markets. The company's low-power FPGAs and software tools are designed to help accelerate development and support innovation across applications from the Edge to the Cloud.

Edge AI is gaining traction as companies seek alternatives to cloud-based AI processing. How do you see this shift impacting the semiconductor industry, and what role does Lattice Semiconductor play in this transformation?

Edge AI is absolutely gaining traction, and it's because of its potential to truly revolutionize entire markets. Organizations across a wide range of sectors are leaning into Edge AI because it is helping them achieve faster, more efficient, and more secure operations, especially in real-time applications, than are possible with cloud computing alone. That's the piece most people tend to focus on: how Edge AI changes business operations once implemented. But there is another journey happening in tandem, and it starts well before implementation.

Innovation in Edge AI is pushing original equipment manufacturers to design system components that can run AI models despite footprint constraints. That means lightweight, optimized algorithms, specialized hardware, and other advancements that complement and amplify performance. This is where Lattice Semiconductor comes into play.

Our Field Programmable Gate Arrays (FPGAs) provide the highly adaptable hardware designers need to meet strict system requirements related to latency, power, security, connectivity, size, and more. They provide a foundation on which engineers can build devices capable of keeping mission-critical Automotive, Industrial, and Medical applications functional. This is a major focus area for our current innovation, and we're excited to help customers overcome challenges and greet the era of Edge AI with confidence.

What are the key challenges that businesses face when implementing Edge AI, and how do you see FPGAs addressing these issues more effectively than traditional processors or GPUs?

You know, some challenges seem to be nearly universal as any technology advances. For example, developers and businesses hoping to harness the power of Edge AI will likely grapple with common challenges, such as:

  • Resource management. Edge AI devices have to perform complex processes reliably while operating within increasingly limited computational and battery capacities.
  • Security. Although Edge AI offers the privacy benefits of local data processing, it raises other security concerns, such as the possibility of physical tampering or the vulnerabilities that come with smaller-scale models.
  • Ecosystem diversity. Edge AI ecosystems can be extremely diverse in hardware architectures and computing requirements, making it difficult to streamline elements like data management and model updates at scale.

FPGAs give businesses a leg up in addressing these key issues through their combination of efficient parallel processing, low power consumption, hardware-level security capabilities, and reconfigurability. While these may sound like marketing buzzwords, they are essential features for solving top Edge AI pain points.

FPGAs have historically been used for functions like bridging and I/O expansion. What makes them particularly well-suited for Edge AI applications?

Yes, you're exactly right that FPGAs excel in the realm of connectivity, and that's part of what makes them so powerful in Edge AI applications. As you mentioned, they have customizable I/O ports that allow them to interface with a wide array of devices and communication protocols. On top of this, they can perform functions like bridging and sensor fusion to ensure seamless data exchange, aggregation, and synchronization between different system components, including legacy and emerging standards. These capabilities are particularly important as today's Edge AI ecosystems grow more complex and the need for interoperability and scalability increases.
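As a rough illustration of that kind of synchronization, here is a minimal sketch in Python rather than FPGA logic, aligning a fast sensor stream with slower camera frames by timestamp; the sample rates, field names, and data layout are hypothetical, not any Lattice interface.

```python
# Illustrative only: timestamp-aligning a fast IMU stream with slower camera
# frames, the kind of aggregation/synchronization an FPGA bridge performs in
# hardware. Sample rates and field names below are hypothetical.

from bisect import bisect_left

def nearest_sample(timestamps, t):
    """Return the index of the timestamp closest to t."""
    i = bisect_left(timestamps, t)
    if i == 0:
        return 0
    if i == len(timestamps):
        return len(timestamps) - 1
    return i if timestamps[i] - t < t - timestamps[i - 1] else i - 1

# Hypothetical streams: camera frames at ~30 Hz, IMU samples at ~1 kHz.
camera_ts = [i / 30.0 for i in range(5)]
imu_ts = [i / 1000.0 for i in range(200)]
imu_samples = [{"t": t, "accel": (0.0, 0.0, 9.8)} for t in imu_ts]

# Fuse: attach the nearest IMU reading to each camera frame.
fused = []
for frame_idx, t in enumerate(camera_ts):
    imu = imu_samples[nearest_sample(imu_ts, t)]
    fused.append({"frame": frame_idx, "frame_t": t, "imu": imu})

for record in fused:
    print(record["frame"], round(record["frame_t"], 3), round(record["imu"]["t"], 3))
```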

However, as we've been discussing, FPGAs' connectivity benefits are only the tip of the iceberg; it's also about how their adaptability, processing power, energy efficiency, and security features drive outcomes. For example, FPGAs can be configured and reconfigured to perform specific AI tasks, enabling developers to tailor applications to their unique needs and meet evolving requirements.

Can you explain how low-power FPGAs compare to GPUs and ASICs in terms of efficiency, scalability, and real-time processing capabilities for Edge AI?

I won't pretend that hardware like GPUs and ASICs lacks the compute power to support Edge AI applications. They have it. But FPGAs really do have an "edge" on these other components in areas like latency and flexibility. For example, both GPUs and FPGAs can perform parallel processing, but GPU hardware is designed for broad appeal and isn't as well suited to specific Edge applications as FPGA hardware is. ASICs, on the other hand, are targeted at specific applications, but their fixed functionality means they require full redesigns to accommodate any significant change in use. FPGAs are purpose-built to offer the best of both worlds; they provide the low latency that comes with custom hardware pipelines and room for post-deployment modifications whenever Edge models need updating.

Of course, no single option is the only right one. It's up to each developer to decide what makes sense for their system. They should carefully consider the primary functions of the application, the specific outcomes they are trying to achieve, and how agile the design needs to be from a future-proofing perspective. This will allow them to choose the right set of hardware and software components to meet their requirements; we just happen to think that FPGAs are usually the right choice.

How do Lattice's FPGAs enhance AI-driven decision-making at the edge, particularly in industries like automotive, industrial automation, and IoT?

FPGAs' parallel processing capabilities are a good place to start. Unlike sequential processors, the architecture of FPGAs allows them to perform many tasks in parallel, including AI computations, with the configurable logic blocks executing different operations concurrently. This enables the high-throughput, low-latency processing needed to support real-time applications in the key verticals you named, whether we're talking about autonomous vehicles, smart industrial robots, or even smart home devices and healthcare wearables. Moreover, they can be customized for specific AI workloads and easily reprogrammed in the field as models and requirements evolve over time. Last, but not least, they offer hardware-level security features to ensure AI-powered systems remain secure, from boot-up to data processing and beyond.
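To put rough numbers on that intuition, here is a simplified cycle-count model in plain Python, not Lattice tooling; the one-multiply-accumulate-per-cycle assumption and the MAC-unit counts are illustrative, but they show why spatial parallelism in programmable fabric cuts latency compared with a single sequential arithmetic unit.

```python
# Back-of-the-envelope model: a sequential processor issues one multiply-
# accumulate (MAC) per cycle, while FPGA fabric can instantiate many MAC
# units that all fire in the same clock cycle. Numbers are illustrative.

def sequential_mac_cycles(n_outputs, macs_per_output):
    # Single arithmetic unit: one MAC per cycle.
    return n_outputs * macs_per_output

def parallel_mac_cycles(n_outputs, macs_per_output, mac_units):
    # All available MAC units work in parallel each cycle.
    total_macs = n_outputs * macs_per_output
    return -(-total_macs // mac_units)  # ceiling division

# Hypothetical layer: 64 outputs, each needing 128 multiply-accumulates.
n_outputs, macs_per_output = 64, 128

print("sequential:", sequential_mac_cycles(n_outputs, macs_per_output), "cycles")
for units in (8, 64, 512):
    print(f"{units:>4} parallel MAC units:",
          parallel_mac_cycles(n_outputs, macs_per_output, units), "cycles")
```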

What are some real-world use cases where Lattice's FPGAs have significantly improved Edge AI performance, security, or efficiency?

Great question! One application I find really intriguing is the way engineers are using Lattice FPGAs to power the next generation of smart, AI-powered robots. Intelligent robots require real-time, on-device processing capabilities to ensure safe automation, and that's something Edge AI is designed to deliver. Not only is demand for these assistants growing, but so is the complexity and sophistication of their capabilities. At a recent conference, the Lattice team demonstrated how using FPGAs allowed a smart robot to track the trajectory of a ball and catch it in midair, showing just how fast and precise these machines can be when built with the right technologies.

What makes this so fascinating to me, from a hardware perspective, is how design tactics are changing to accommodate these applications. For example, instead of relying solely on CPUs or other traditional processors, developers are beginning to bring FPGAs into the mix. The main benefit is that FPGAs can interface with more sensors and actuators (and a more diverse range of those components), while also performing low-level processing tasks near those sensors to free up the main compute engine for more advanced computations.

With the growing demand for AI inference at the edge, how does Lattice ensure its FPGAs remain competitive against specialized AI chips developed by larger semiconductor companies?

There's no doubt that the pursuit of AI chips is driving much of the semiconductor industry; just look at how companies like Nvidia pivoted from making video game graphics cards to becoming AI industry giants. Still, Lattice brings unique strengths to the table that make us stand out even as the market becomes more saturated.

FPGAs are not just a component we're choosing to invest in because demand is growing; they are an essential piece of our core product line. The strengths of our FPGA offerings, from latency and programmability to power consumption and scalability, are the result of years of technical development and refinement. We also provide a full range of industry-leading software and solution stacks, built to optimize the use of FPGAs in AI designs and beyond.

We've refined our FPGAs through years of continuous improvement, driven by iteration on our hardware and software solutions and by relationships with partners across the semiconductor industry. We'll continue to be competitive because we'll stay true to that path, working with design, development, and implementation partners to ensure we're providing our customers with the most relevant and reliable technical capabilities.

What role does programmability play in FPGAs' ability to adapt to evolving AI models and workloads?

Unlike fixed-function hardware, FPGAs can be retooled and reprogrammed post-deployment. This inherent adaptability is arguably their biggest differentiator, especially in supporting evolving AI models and workloads. Considering how dynamic the AI landscape is, developers need to be able to support algorithm updates, growing datasets, and other significant changes as they occur without worrying about constant hardware upgrades.

For example, FPGAs are already playing a pivotal role in the ongoing shift to post-quantum cryptography (PQC). As businesses brace against looming quantum threats and work to replace vulnerable encryption schemes with next-generation algorithms, they're using FPGAs to facilitate a seamless transition and ensure compliance with new PQC standards.

How do Lattice's FPGAs help businesses balance the trade-off between performance, power consumption, and cost in Edge AI deployments?

Ultimately, developers shouldn't have to choose between performance and possibility. Yes, Edge applications are often hindered by computational limitations, power constraints, and increased latency. But with Lattice FPGAs, developers get flexible, energy-efficient, and scalable hardware that is more than capable of mitigating these challenges. Customizable I/O interfaces, for example, enable connectivity to a variety of Edge applications while reducing complexity.

Post-deployment modification also makes it easier to adjust to the needs of evolving models. Beyond this, preprocessing and data aggregation can take place on FPGAs, reducing the power and computational strain on Edge processors, lowering latency, and in turn cutting costs and increasing system efficiency.
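Here is a minimal sketch of that division of labor, with the on-FPGA aggregation mocked in Python; the window size, features, and threshold are illustrative assumptions rather than any real Lattice interface, but they show how raw high-rate sensor data can be reduced before the edge processor's model ever sees it.

```python
# Illustrative only: raw high-rate sensor samples are aggregated (as an FPGA
# would do in hardware) into a much smaller feature stream, so the downstream
# edge processor handles far less data. All parameters are hypothetical.

from statistics import mean

def fpga_style_preprocess(raw_samples, window=64):
    """Mock of on-FPGA aggregation: reduce each window to (mean, peak)."""
    features = []
    for start in range(0, len(raw_samples) - window + 1, window):
        chunk = raw_samples[start:start + window]
        features.append((mean(chunk), max(chunk)))
    return features

def edge_processor_inference(features, threshold=0.8):
    """Mock of the downstream model: flag windows whose peak exceeds a threshold."""
    return [i for i, (_, peak) in enumerate(features) if peak > threshold]

# Hypothetical raw stream: 4,096 samples from a vibration sensor.
raw = [0.1 if i % 500 else 0.95 for i in range(4096)]

features = fpga_style_preprocess(raw)          # 4,096 samples -> 64 feature pairs
alerts = edge_processor_inference(features)

print(f"{len(raw)} raw samples reduced to {len(features)} feature windows")
print("windows flagged by the edge model:", alerts)
```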

How do you envision AI hardware evolving over the next 5-10 years, particularly in relation to Edge AI and power-efficient processing?

Edge devices will need to be faster and more powerful to handle the computing and energy demands of the ever-more-complex AI and ML algorithms businesses need to thrive, especially as these applications become more commonplace. The dynamic hardware components that support Edge applications will need to adapt in tandem, becoming smaller, smarter, and more integrated. FPGAs will need to build on their current flexibility, offering low-latency and low-power capabilities at greater levels of demand. With these capabilities, FPGAs will continue to help developers reprogram and reconfigure with ease to meet the needs of evolving models, whether for more sophisticated autonomous vehicles, industrial automation, smart cities, or beyond.

Thank you for the great interview. Readers who wish to learn more should visit Lattice Semiconductor.
