[00:00:00] Speaker A: We have opportunities for college-educated technologists or even skilled trades. Electricians would be a very interesting path into machine vision, because inevitably the projects are of a scale, a size, that naturally being able to do a little bit of the wiring, a little bit of the communications is a really good skill set to have. And so that's one step from, you know, electrician into machine vision.
[00:00:30] Speaker C: Hello, everyone, and welcome to the Robot Industry Podcast. My name is Jim Beretta. I'm your host, and thank you for subscribing. My guest today is Andrew Perkins. He is the business manager at TrueLight Machine Vision Integration. Hey, welcome to the podcast, Andrew.
[00:00:44] Speaker A: Hi, Jim. Thank you very much for having me.
[00:00:47] Speaker C: Hey, Andrew, how did you get into the industry? I'm always inquisitive about this kind of thing.
[00:00:51] Speaker A: It started with an interest in engineering: solving problems that hadn't been solved before, taking on unique challenges, and working cross-functionally with different disciplines, understanding that you can't be an expert in all things. So putting mechanical engineering, electrical controls design, robot programming, and PLC programming together, and creating, with skilled trades, these amazing machines that can do incredible things for customers and change their ability to compete and survive and thrive. That's what drew me into the industry: the development that comes from being part of high tech, rapid change, high pressure. It's not for everybody, but for those that enjoy it, it can really mold you and help you become not just a better teammate, but a better person. And that's certainly been the case for me as I've grown with BOSS, starting as a project manager, then helping Ben on the sales and marketing side with applications engineering, then tapping a little bit more into my marketing and product development skill set as director of technology development, looking a little bit further down the horizon, not just at what solutions our customers need today, but what's coming. And now, recently, in October 2024, transitioning into the business manager of TrueLight, trying to take a capability that we've developed since 2019 and turn it into something special and unique, and give it the focus that we think it needs to really serve industry, and serve BOSS, who's our biggest customer right now.
[00:02:25] Speaker C: So, yes, tell us a little bit about TrueLight. It's been spun out of BOSS, but it's its own independent company, correct?
[00:02:31] Speaker A: It is its own independent company, organically grown out of BOSS and still owned by BOSS. Right now, if you were to ask me who TrueLight's customers are, I would say today my biggest customer is BOSS Innovations, and for all of the custom automation solutions that BOSS is integrating for its customers that require machine vision or advanced robotic guidance, that's a core part of the overall turnkey integrated solution. The game plan is other manufacturers, you know, TrueLight serving other manufacturers directly that are looking for tailored machine vision solutions, tapping into this expertise that we've been honing since about 2019, and then potentially other integrators, like BOSS, but that don't have a TrueLight. And so the thought being it would be a little bit easier to work with us if we have that arm's-length separation and a clear capability and expertise for how we could work together on the integration of machine vision.
[00:03:24] Speaker C: And you're in a separate building too, right? Like, you don't share the same roof.
[00:03:28] Speaker A: No, we did move to a separate location. Depending on the lights at Veterans and Bradley in London, it's between two and seven minutes away, but it is a separate facility. BOSS actually moved to this facility in 2021 as part of an idea to separate our non-operational staff, sales, applications engineering, technology development, into a bit of a skunkworks environment, to really start to think about future solutions and give the operational space in Dorchester the room it needed to grow. And then BOSS has since moved to London, and the core BOSS staff relocated, as I mentioned, around October 2024. This facility, which we call the Collider, is a Western University building on the south end of London, very near the 401. It's now TrueLight's home, and we're looking to grow into this space and make it our own, and eventually change that color you see behind me. But right now it's still BOSS red.
[00:04:25] Speaker C: So tell me a little bit about what's going on in automated vision, like in the industry of vision. It's exciting, lots of changes.
[00:04:32] Speaker A: It's incredibly exciting. And that's in part the thrust for TrueLight: giving BOSS the ability to tap into that innovation, which is rapid and really hard to keep up with. TrueLight is part of BOSS's solution to keeping up with that innovation, creating focus and a priority on it, where before it was a smaller part of the overall BOSS company, which is highly capable of doing assembly automation, laser automation, welding automation for manufacturers in the automotive, nuclear, mining, metals, and general industries, often with vision. But now, with TrueLight, machine vision is the objective, and specifically the integration of those solutions. And what we're noticing in machine vision is that the product development and launch cycle is becoming shorter and shorter. With AI tools coming on rapidly, the time to get to alpha, that alpha model, that alpha solution that you can then iterate to beta, gamma, and so forth, is becoming quicker and quicker. So it's a matter of managing that change and figuring out how to move quicker, but still diligently. It's all risk management, and making sure that we maintain our place as integrators, understanding that our job is to make it work: take that conceptual solution the final five or ten percent into production and sustain it through its lifecycle. We're noticing the hardware behind these machine vision solutions, whether it be the 2D cameras, the 3D sensors, the laser line profilers, is becoming more and more of a commodity: incredibly competitive, a fierce space with lots of alternatives. The software and the tools are becoming more and more where we're seeing the differentiator, and where skills development on the integration side is becoming critical for me, changing how I look at the skill set of a machine vision professional. We're looking more and more into software and computer engineering, trying to tap into university programs that are teaching people how to learn and develop with AI. For me, that's a bit of a disruptive force crashing into the side; it's exciting, and we're just trying to figure out how to harness it. One of those answers is bringing on people that are more and more comfortable using it, using it well, and helping shape that future, while trying to maintain focus on the fundamentals. There's a lot of change, a lot of innovation happening in software, tools, AI, yes. But the fundamentals of system specification and optical design are still critical, and that's still something we're trying to maintain a focus on. We believe there's value in being experts in that, to make sure we're doing vision right, specifying vision right. You can do a lot of cool things with software and tools, but your job is a lot easier, and a lot more robust, if you've got good hardware and good fundamentals under the hood. So we tap into the A3 CVP program for that.
[00:07:31] Speaker C: Yes.
[00:07:31] Speaker A: So that's a core part of our development life cycle. We have two entry-level programmers, and part of their development is going to be going through that CVP, basic and then advanced level, building out those fundamentals of machine vision while we take on these custom applications and try to do things in new ways with exciting new tools.
[00:07:50] Speaker C: Thank you for that. So I assume too that you've got your standard offerings, but you also do a lot of custom work, I should think.
[00:07:59] Speaker A: Absolutely. Jim, custom solutions are largely what we come up with at the end. It starts with a customer that has a process or a part that they're manufacturing. They're the expert in that process. It's often unique and even proprietary to them, that customer or their industry.
And so our process starts with applications engineering, where we apply what we know and specify a solution that we think is going to solve that customer's problem, and then sometimes go through a proof of concept to de-risk that.
And then throughout the course of time, leveraging best practices, industry developments, lessons learned, to try and add that level of standardization to our process and develop the way we think, the way we break down problems, the way we consult with our customers to make that better and better, but maintain flexibility on the solution side because it is so custom.
[00:08:54] Speaker C: Yeah. And I assume that you're also doing a lot of customer education too, Right. Because like industry's changing quickly, lenses and lighting and everything. So do you spend a bit of time saying, hey, listen, we got to define what a scratch is before we are going to try to look for a scratch?
[00:09:12] Speaker A: Absolutely. The definition of a defect is critical when you think about how a system is ultimately going to classify or categorize those measurements, those individual judgments it is going to make on each part, and then the ability to take that judgment, or determination, when it's being done in an automated way, record it in a database, and plot it, you know, in an SPC chart that gives you short- and long-term information about how your process is running. The way you categorize those data points at the very beginning is very important. And so that is an exercise that can sometimes feel a bit like semantics, but it is very important to get to that definition of what we're going to call it. The word that we end up using is not too important, as long as we have that common definition. And in many cases that'll lead us to a different definition. I'll give you an example of an application we've worked on recently, where the customer was looking for detection of scratches and dings and dents.
And when you get down to the root of what that really looks like to a vision solution, it's surface indentation.
It might be called a scratch, it might be called a ding, it might be called a dent. But what we're really looking for is that indentation in the surface and that's the defect and the level of that indentation is where we really zone in and specify a solution that can find that and call it surface indentation and leave some of the subjectivity out of the labeling process.
I'll give you another example, where right now we have a customer with a specific defect for weld inspection.
And we are going through a relabeling exercise, a blind study to get to the definition of what undercut is.
[00:10:55] Speaker C: Okay.
[00:10:55] Speaker A: And so that's a defect that we've gone through a labeling exercise on. We've trained a model on what undercut looks like, and it is now performing that judgment on production parts. We're reviewing those judgments with the expert on the customer side, and we're seeing a difference, part to part, between whether that's called undercut or not. From our perspective, it looks very much the same to the human eye: that looks like undercut, that looks like undercut. But ultimately what we need to do is come away with a common definition of what it is, train the model to that definition, and then deploy it so that it meets the customer specification. Which, ultimately, when you talk about what is a scratch: there's the definition of what is a scratch that is not a shippable scratch, right? We'll call that the red zone. A scratch that is at this level or worse, that's the red zone, where we cannot ship, where we cannot risk our customer being exposed to that. Then, what is the green zone, where you know that your part, or your process, is performing at its optimal level, where all of your incoming sources of variability are minimized and you're running as good as can be? And then where's that yellow zone, where you don't want to stop your process, but you want to be informed about what's going on, that you may have some levels of variability that potentially could be growing toward that defect or that red zone, and you could correct it proactively?
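[Editor's note] The red/yellow/green zoning Andrew describes boils down to severity thresholds on a measured defect. A minimal sketch, where the scratch-depth metric and the limit values are invented for illustration, not an actual TrueLight specification:

```python
# Hedged sketch: the "scratch depth" feature and these thresholds are
# illustrative assumptions, not a real customer specification.
RED_LIMIT = 0.50     # mm: at or beyond this, the part is not shippable
YELLOW_LIMIT = 0.20  # mm: worth flagging as process drift, still shippable

def classify_zone(scratch_depth_mm: float) -> str:
    """Map a measured surface-indentation depth to a process zone."""
    if scratch_depth_mm >= RED_LIMIT:
        return "red"      # stop: cannot risk shipping this
    if scratch_depth_mm >= YELLOW_LIMIT:
        return "yellow"   # inform: variability may be growing
    return "green"        # process running at its optimal level

measurements = [0.05, 0.22, 0.61]
print([classify_zone(d) for d in measurements])  # ['green', 'yellow', 'red']
```

Plotting these classifications over time is exactly the SPC-chart view mentioned earlier: the yellow zone gives the operator a chance to correct proactively before any part reaches red.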
[00:12:25] Speaker C: I love this idea about talking about use cases. Do you have any other use cases that are maybe top of your mind?
[00:12:31] Speaker A: Absolutely. One that's at the top of my mind, related to our launch of 3Di weld inspection, is detecting porosity, burn-through, or other surface defects in welds. We've got applications for weld inspection in the field, but when we demonstrated that at a trade show in March, a customer saw it. They saw that we were capable of detecting porosity in welds, and their problem was porosity in machined surfaces.
[00:12:59] Speaker C: Okay.
[00:13:00] Speaker A: And so that very quickly allowed us to make that pivot: yes, we have the ability to see porosity here; we can see porosity in that machined surface. That was an interesting use case that pivoted from weld inspection, but with that commonality: it's porosity in a slightly different form, and we're still capable. I mentioned scratches, blisters, cracks. That's an application we have right now on a die-cast part.
And the value-add here is not just catching those defects, but where we catch them. Specifically for this customer, the value of an inspection solution spikes if we can catch those defects before they paint the part, because at that point it's still recyclable, and they can recover a lot of the value from that part. If you find the scratch, the blister, the crack once it's been painted, it's a lot more obvious, so it's easier to see, but that part can't be recycled. So finding the right place to install that solution to maximize the value to that customer's process is also a key part of coming up with the right use case for machine vision. And then there's another example that we're commissioning right now, for defect detection in a press cavity. We've got a camera that's 12 feet away taking a picture of a really dark press cavity, and it's looking for a specific defect inside of this press that is really infrequent. It doesn't happen very often, but when it does, the consequences are catastrophic. And so this is a long-term commissioning activity that requires some anomaly detection that will eventually turn into a supervised solution. But that gets to another piece of the classification of the use case: frequency and severity. When you're talking about machine vision and detection of defects, a huge part of the consideration in specifying the right solution is how frequent is that defect, and how severe is it when you have it.
[00:14:50] Speaker C: Being an amateur photographer, I kind of love some of this stuff. The camera's 12 feet away; you're like, oh my gosh, how do you do that? Right, yeah.
[00:14:59] Speaker A: And that's really neat. And that's 2D. It's a 2D camera with some AI edge learning on board, but a CVP Advanced, like I mentioned, was able to work in partnership with a partner to develop the right solution for that application. And what's interesting is it has the opportunity to transfer, you know, not just to that press, but to others. So you're right: the optical design, the lensing, the lighting. There's a picture of that camera on our social media on LinkedIn, so you can see just the size of the camera. It's the biggest one I've ever seen so far.
[00:15:32] Speaker C: That's great.
Thank you for offering those. So I kind of assume you're busy.
[00:15:37] Speaker A: Oh my gosh, very busy right now. Like I mentioned earlier, BOSS has secured quite a bit of work right now that is keeping the TrueLight team very busy, and in parallel we're trying to fill those peaks and valleys. So this year has been very busy for us. We're seeing machine vision, or the opportunity to add it, on just about every application that comes across our plates right now.
[00:16:03] Speaker C: Are there any big changes? And you kind of mentioned this in the intro: in lighting or in lenses.
I'm a big lens guy, I love big lenses. But are you seeing much change in those two pieces of hardware?
[00:16:17] Speaker A: There is development there as well. One thing I'm noticing is more and more embedded lenses and lighting coming right in the sensor, especially on the 3D side of things, which offers some value compared with the machine vision solutions of the early 2000s, which often came with large disclaimers about sunlight and where you put the machine, and no windows, and things like that. I contrast that with our 3D AI weld inspection solution, which comes with an embedded onboard laser light source calibrated from the factory: no external lighting required, no task lighting, no calibration required when you're moving it from the BOSS or TrueLight floor to a customer floor, or a demo cell moving from A to B to C. So that's one thing we're noticing, and from customers, just the desire to remove that sensitivity, whatever the end solution has to be: remove the sensitivity to lighting and optics. And then lenses are certainly becoming part of the overall package, but more and more are something that is specified with the camera from that OEM.
[00:17:25] Speaker C: You mentioned AI. Is AI pretty much a daily part of your life now? It's coming on board.
[00:17:35] Speaker A: We've seen, I'll say, the glass ceiling kind of break, similar to 2016, when all of a sudden cobots were no longer, you know, something that was unique.
[00:17:45] Speaker C: Yeah.
[00:17:45] Speaker A: Unique or theoretical. And everyone wanted one; there were mandates. Now you're seeing mandates for AI-based solutions, and AI is a broad term that can mean a lot of different things. When we talk about AI in machine vision, in terms of how we use it for customers and to benefit them, we're often talking about machine learning.
[00:18:02] Speaker C: Right.
[00:18:02] Speaker A: And training neural networks and algorithms to make decisions that are human-like, or ideally even superior to human performance, without some of the human factors, such as the deterioration over time when you're doing that dull or routine task over and over and over, and freeing up humans to do the fun stuff that requires the variability and the dexterity and the flexibility of humans. So the AI solution there is machine learning, but we're using it more and more in just how we work. I've got an intern coming on board for the summer. I've been tracking this: Western University is going to be graduating engineers in 2026, for the first time, with a degree in artificial intelligence. And that's one answer to how we keep track of this; there's only so much time, short of going on sabbatical, to keep up with this stuff at work. So we're tapping into the people that are learning it. He's going to be here for the summer, and one of his projects is going to be helping with synthetic image generation, which is something that in the AI space is very cool. If you think about machine vision and training a model, it often requires defect samples, and depending on where the customer is at in their product development lifecycle, this may be a brand new piece of automation: the first parts are going to be made on the machine that you built. So getting defects that you can use to train a vision system, accelerating that timeline, is going to be huge for TrueLight. And there are companies coming online that can help you generate synthetic images using AI. Imagine, as a photographer, not actually having to take the picture, or taking one picture and then asking for derivatives of that picture, so that you can create that distribution of data that you can then use to train a model more and more robustly. And so I'm very curious to find out: if you train a model with 100% real images, a model with 100% synthetic images, and a model with a blend of both, what level of precision and recall might you be able to get from each system? And can that accelerate our ability to serve customers and reduce the burden of what we ask from them to start, which is often some level of sample parts to do a feasibility study or something of that nature, to de-risk it for everybody?
[00:20:20] Speaker C: When you mention things like this, my mind goes to data. So you must spend a lot of your time as well talking to your customers, and internally, and to some of the other experts, about what do we do with all this data?
[00:20:34] Speaker A: Absolutely. And it comes down to a natural synergy we're seeing between a good machine vision solution and data collection and visibility of that data. We're noticing there's a real reliance on data, not just to track that individual part, but to have long-term traceability on it, whether it be for warranty or security in the field. So that data is becoming a core part of every solution. But then that brings us into the realm of cybersecurity: how do you handle that data well? There are cloud-based alternatives, and there are local alternatives, and as an integrator we've had to maintain the flexibility, depending on our customers, to save and transmit that data in a cloud-based environment or in a local, on-premise environment, which for many customers is still a mandate: all data has to be stored on their premises and can't leave those four walls. And so being able to harness data, record it, and use it for commissioning in that local environment is still a challenge, but it's something that we have experience working within.
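[Editor's note] One common way to satisfy that on-premise mandate is a local database file, so inspection judgments never leave the four walls. A minimal sketch using Python's built-in SQLite; the table layout and field names are invented for illustration, not a real traceability spec:

```python
import sqlite3

# Hedged sketch: schema and values are illustrative assumptions only.
# A real deployment would use an on-disk file path on the customer's server.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE inspections (
    part_serial TEXT,
    defect      TEXT,
    severity    REAL,
    verdict     TEXT,
    ts          TEXT DEFAULT CURRENT_TIMESTAMP)""")
conn.execute(
    "INSERT INTO inspections (part_serial, defect, severity, verdict) "
    "VALUES (?, ?, ?, ?)",
    ("SN-0001", "surface indentation", 0.12, "pass"),
)
rows = conn.execute("SELECT part_serial, verdict FROM inspections").fetchall()
print(rows)  # [('SN-0001', 'pass')]
```

The same table can later feed the SPC-style charts mentioned earlier, and because it is a single local file, it fits the "can't leave the premises" constraint without any cloud dependency.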
[00:21:48] Speaker C: Thank you for that. It's kind of mind-boggling sometimes. It's like this onion: you just keep exposing another layer, and another layer, and another layer. What about robot guidance? I'm kind of interested to hear if you use robot guidance with AI. Is that something that you're seeing at TrueLight?
[00:22:04] Speaker A: Yes, we are. For robot guidance, thinking about the TrueLight side of it: BOSS is very capable of putting 2D sensors on the end of a robot to do basic offsets, things of that nature. TrueLight adds value when it comes to the more advanced guidance, in particular pursuing solutions for random, unstructured bin picking. It's pretty amazing what a human can do: the infinite number of gripping positions, the path planning that a human does automatically, without thinking about it, to not hit the bin, to prioritize the order in which they empty that bin, to not crash their knuckles into the adjacent part. There's amazing programming that you're doing on the fly. To program that traditionally would require a level of brute force that's just not practical. So AI controlling that path planning is a very exciting application for us, one TrueLight is seeing more and more opportunities to do and then feed into larger automated systems.
[00:23:05] Speaker C: And when you say don't hit the bin, you actually mean the robot end effector might come in contact with a bin that's been knocked around a lot over its life, and don't hit the bin is really important, right?
[00:23:18] Speaker A: Yep. Don't hit the bin. It might not be exactly the same shape as the last bin that was there, or that bin might not be in the exact same position, because it was dropped off by a forklift, plus or minus a couple inches.
[00:23:30] Speaker C: Right.
[00:23:30] Speaker A: So you still need to be able to empty that bin with no collisions. AI plus vision to guide those robotic paths is coming online, and it's an exciting application. We're seeing it more and more, and deploying it more and more in production.
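[Editor's note] The "don't hit the bin" constraint can be illustrated with a toy filter over grasp candidates: keep only the pick points whose gripper footprint stays clear of the bin walls. The bin dimensions, clearance margin, and candidate points below are all invented for illustration; a real planner would also check the full approach path in 3D:

```python
# Hedged sketch: a toy 2D clearance filter for bin-picking grasp candidates.
BIN_MIN, BIN_MAX = (0.0, 0.0), (0.60, 0.40)  # bin footprint in metres (x, y)
GRIPPER_CLEARANCE = 0.05                      # keep gripper this far from walls

def is_reachable(x: float, y: float) -> bool:
    """True if a straight-down grasp at (x, y) clears the bin walls."""
    return (BIN_MIN[0] + GRIPPER_CLEARANCE <= x <= BIN_MAX[0] - GRIPPER_CLEARANCE
            and BIN_MIN[1] + GRIPPER_CLEARANCE <= y <= BIN_MAX[1] - GRIPPER_CLEARANCE)

# Vision proposes candidates; the planner discards wall-adjacent ones.
candidates = [(0.30, 0.20), (0.02, 0.20), (0.58, 0.39)]
print([is_reachable(x, y) for x, y in candidates])  # [True, False, False]
```

In practice the bin pose itself would come from vision too (the forklift's plus-or-minus a couple inches), so `BIN_MIN`/`BIN_MAX` would be re-estimated per cycle rather than fixed.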
[00:23:45] Speaker C: And what kind of vision inquiries do you get from customers?
[00:23:48] Speaker A: We see a spectrum of inquiries. It starts from a developed specification: a customer that has a fairly good understanding of their process, their part, and what they need from the inspection system. At the other end of the spectrum is an open-ended request for consultation, maybe even an invitation into their plant to try to identify what is possible. There's a need to get better, a desire to get better. They know that there are things out there; they're also aware, just like TrueLight, that technology is changing, that what's possible is changing and being redefined quicker and quicker. So we're able to help them. But that consultative approach, while highly beneficial, sometimes takes time and trust to get to that point. You're often starting with that developed specification, or trying to bid in a competitive fashion to come up with a solution that serves that need, that solves that problem, but still, yes, competitively speaking, meets their ROI.
[00:24:48] Speaker C: So you go in and visit a customer, review their operations, maybe bring some cameras in yourself?
[00:24:53] Speaker A: Absolutely. And we like to say we're the experts at integrating machine vision, the experts at applying that technology, that software. We want to tap into our customers' expertise in their parts and their process, learn as much as we can from that plant floor, and take that into our solution. And if we can't do it on their floor, if we can't gain it from just some of those conversations, something that we like to do, and it's getting more and more buy-in from customers because of the mutual de-risking it offers, is a proof of concept.
[00:25:27] Speaker C: Right.
[00:25:27] Speaker A: Or a small engineering study that looks something like this, Jim: we'll believe there's something possible, and we'll design a small experiment that can take place on their floor or in the TrueLight lab, with a not-to-exceed time and materials amount, and we will prove out: yes, that is going to work. Then we'll add that production safety factor, because we're cognizant that we're potentially doing it in the lab. And if we decide to move forward with a larger implementation, we always offer that proof of concept amount, whatever it ended up being, off the top of the solution, because it allowed TrueLight to offset some of our cost of sales, which is a part of our business, and it meant the customer had some skin in the game from their perspective. But there's no obligation to move forward if that solution doesn't end up, you know, checking all of their boxes. At least they've minimized, in a risk-managed way, their initial investigation.
[00:26:24] Speaker C: I love these due diligence projects, because they're really interesting and, like you say, we can do it offline, so we don't have to take your robot down or your production down. We'll just make sure it works first, and then we can go.
And then like you say, you both have kind of skin in the game.
What does the future look like for vision? Do you have any predictions like you work in this all day long and it's been changing so much?
[00:26:45] Speaker A: Yeah, there's a couple of things that are interesting that I'm curious to see. I'm looking at supervised learning in machine vision, where you're labeling specific things to teach the model what to look for, and the biggest barrier to going faster with that, and doing it better, is the sample part requirements. So I'm curious to watch how generative AI, in 2D, which is a little bit more clear to me (3D is a bigger question to me, how you do it there), can accelerate the ability to deploy supervised learning models into the field quicker. And will that go faster than the development of anomaly detectors, which is kind of doing things the other way: training the system on the definition of good and asking it to identify everything that is considered an anomaly? More and more, those anomaly detectors are improving to the point where, with some of their edge learning and some of the data sets they've been trained on, they can even start to pick up on some common defects and do a little bit of labeling. So I'm very curious to see which of those paths is going to potentially provide TrueLight and our customers with a faster way to get to alpha. And then another thing that I'm looking at, related to that trend, is that with AI tools coming online, the development of applications and software and tools to solve challenges is becoming faster and faster. But are all of these tools viable for production? How do you take these things that are developed very quickly in a prototype fashion, Python scripts, things like that, and get them into a form where they can transfer into production? How do you harness that speed and then translate it into production? That's pretty interesting. But I do see a future where robots all have eyes, or they're all talking to one another: more and more lights-out manufacturing, and then exciting opportunities for people to work on the training of these algorithms and the support and maintenance of them, systems that get commissioned and last five, ten years or more.
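[Editor's note] The anomaly-detection direction described above (train only on the definition of good, flag deviations) can be sketched with a simple statistical baseline. The single scalar feature and the 3-sigma threshold here are illustrative assumptions; real detectors operate on learned image features, but the logic is the same:

```python
import statistics

# Hedged sketch: a toy anomaly detector fit only to "good" samples.
good_samples = [0.48, 0.50, 0.51, 0.49, 0.52, 0.50]  # e.g. a surface feature
mu = statistics.mean(good_samples)
sigma = statistics.stdev(good_samples)

def is_anomaly(x: float, k: float = 3.0) -> bool:
    """Flag anything more than k standard deviations from the 'good' mean."""
    return abs(x - mu) > k * sigma

print(is_anomaly(0.50), is_anomaly(0.80))  # False True
```

Note the contrast with supervised learning: nothing here ever saw a defect, which is exactly why this style suits the rare-but-catastrophic press-cavity case, where defect samples may never be available for labeling.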
[00:28:58] Speaker C: How do I get into it? If I were a young person interested in robotics, interested in machine vision, what kind of courses do I take, and where do I learn more, so I can position myself to do interesting things in AI and machine vision?
[00:29:12] Speaker A: That's an interesting question. And the one thing I would say, that I'm realizing more and more, is there isn't one right path. We have opportunities for college-educated technologists or even skilled trades: electricians would be a very interesting path, you know, into machine vision, because inevitably the projects are of a scale, a size, that naturally being able to do a little bit of the wiring, a little bit of the communications is a really good skill set to have. So that's one step, you know, from electrician into machine vision. Design, mechatronics, design engineering at the university level: a great way in. And then, more and more, I'm looking at computer science and software development as a bit of a background, AI as a bit of a background, realizing that, with our expertise here at TrueLight and the A3 program and things like that, we can teach the fundamentals of machine vision. But have that skill set to be able to solve problems that you've never solved before, and have a mindset and an interest and an excitement about that, because very rarely do we have something come across our desk that is exactly like the thing we just did.
[00:30:31] Speaker C: Well, thank you very much for coming on today, Andrew. Have we forgotten to talk about anything today?
[00:30:36] Speaker A: Well, I would just mention, if anyone is interested in getting a hold of me or TrueLight, you can start with our [email protected]. We also have a presence on LinkedIn, where you can find us; we're starting to share a bit more about what we're doing, and we're on YouTube as well. And then TrueLight is going to be out at Automate in May, yes, with one of our partners, LMI Technologies. We'll be in their booth, booth 5200, with our 3D AI weld inspection solution. I'll be there for the full week, the 12th to the 15th of May, and I would be happy to talk to anyone that's visiting Automate in person.
[00:31:14] Speaker C: Great. Well, we'll see you in Detroit. Andrew, thanks very much for taking time out to chat with me today.
[00:31:21] Speaker A: Thank you, Jim, so much for having me on your podcast.
[00:31:23] Speaker C: I'd like to acknowledge A3, the Association for Advancing Automation. They're the leading automation trade association for robotics, vision and imaging, motion control and motors, and industrial artificial intelligence technologies. Visit automate.org to learn more. And if you'd like to get in touch with us at the Robot Industry Podcast, you can find me, Jim Beretta, on LinkedIn. Today's podcast was produced by Customer Attraction Industrial Marketing, and I'd like to thank my team: Chris Gray for the music, Jeffrey Bremner for audio production, my business partner Janet, and our friends at BOSS Innovations and TrueLight Machine Vision Systems.
Have a good day and be safe out there.