Speaker 0 00:00:01 You know, machine vision is not what a lot of people think it is today. Machine vision has been around for many, many years, and folks who are new to the industry maybe don't have a perspective on what machine vision is, what it does, and some of the differences between traditional, application-based automation machine vision relative to computer vision and deep learning. I enjoy talking about these topics. I hope that as we talk today, Jim, we get some information that's going to help people understand machine vision, understand how they're going to be able to use it best for their applications, and learn to differentiate what will be useful in their automated applications.
Speaker 1 00:00:50 Hello everyone, and welcome to The Robot Industry Podcast. I'm glad you're here and thank you for subscribing. My name is Jim Beretta and I'm your host, and it's my pleasure to have David Dechow with me today. David Dechow is a globally recognized expert in the integration of machine vision, robotics and industrial automation technologies. He's the principal vision systems architect for Integro Technologies Corp., where he performs application evaluation and design of complex automated imaging solutions for inspection, metrology and robot guidance. Prior to his position at Integro, he was staff engineer for intelligent robotics and machine vision at FANUC, and earlier was the founder, owner and principal engineer for two successful systems integration firms. His career in machine vision and industrial automation spans more than 35 years. David is the recipient of the AIA Automated Imaging Achievement Award, honoring industry leaders for outstanding career contributions in industrial and/or scientific imaging.
Speaker 1 00:01:53 He's a member of the Association for Advancing Automation's imaging technology strategy board, and is a contributing editor for Vision Systems Design magazine and a technical advisory board member with Saccade Vision Ltd. As a key educator within the industry, David has partnered in the training of hundreds of engineers as an instructor with the A3 Certified Vision Professional program. He's also well known for his frequent, informative technical articles, papers, webinars, and conference sessions and classes covering a wide range of technical topics, including machine vision and enabling technologies in industrial automation. So that's a mouthful, David, and welcome to the podcast.
Speaker 0 00:02:34 Thanks very much, Jim. It's a pleasure to be here with you.
Speaker 1 00:02:37 I'm excited. I'm glad that you joined us, and we both do webinars for the <inaudible>, so it's kind of fun to have us both on a podcast together. Exactly. Hey, can you tell our audience just a little bit about Integro Technologies and what you do there?
Speaker 0 00:02:53 Yes. Integro is a large machine vision systems integrator. We specialize exclusively in solutions that involve machine vision for a variety of tasks, whether it be inspection, metrology, or robotic guidance. My job is to look at all of the system requests that come in and evaluate what the actual needs of the application are, and then develop the architecture of both the imaging, the software solution, and the automation that's going to make those systems the most successful and meet the customer's needs. We get a wide variety of applications, and that's really what has always made the job of machine vision fun: the diversity of what you work with.
Speaker 1 00:03:45 And of course, I come from a big robot integrator as well, and we kind of laughed in our warmup call about you getting the really hard jobs. The easy jobs don't come to you, which would be nice, but you get the challenging stuff, which, like you say, makes it very technically fun and interesting.
Speaker 0 00:04:03 Absolutely. And, uh, I think that I can say safely say over my career, I can't think of any particular application base or market uh, application-based that I haven't had at least a chance and, uh, and the pleasure of touching and, uh, uh, making some difference in that, uh, in that particular application use case.
Speaker 1 00:04:25 So I wanted to ask, from your global perspective, what's happening in automated machine vision?
Speaker 0 00:04:33 Well, I think the bottom line is that machine vision is growing, or let's say continuing to grow, kind of as it has been if you look at a statistical regression line, ever since literally the mid-seventies in industrial automation. Machine vision is just a fantastic enabling technology for a whole lot of other technologies within the automation realm. We continue to see more rapid advances in components, which in turn drive advances in new and pretty exciting applications. Certainly in really the last 10 or 12 years, I even think I see continued, or increased, acceptance within the marketplace of machine vision as a technology: the understanding of how it benefits Industry 4.0 and smart manufacturing, and delivers data to the AI that's become more prevalent in processing and understanding what's going on in a manufacturing process.
Speaker 0 00:05:44 The other thing, if I were to look at something that is becoming somewhat of an obstacle in my mind, is the buzz and hype surrounding some of the new technologies. We want to be sure that doesn't dilute the understanding of the end user, and of the people who are going to use these systems, of what machine vision really can do for them, with or without all of the new trending technologies.
Speaker 1 00:06:19 And you and I are old enough to understand, and I love your analogy about the buzz and the hype of the new tech when, uh, when, when machine vision was early. And sometimes it would go in and because of lighting or because of technical results, they'd ended up turning it off because it, it just wasn't robust enough. And we we've got the robustness and we've got that acceptance. I think those are really, uh, really good things for the industry.
Speaker 0 00:06:42 Well, I think that's an incredible, a very important statement. One of the things you hear in the marketplace today is, oh, we have to incorporate this brand-new technology (pick any name you want), and I've actually heard people say this because, supposedly, most machine vision systems in the marketplace are now turned off because they don't work. Really, the machine vision marketplace globally is what, a $13 to $14 billion global marketplace, depending on who you talk to. And I guarantee you it's not that size of a marketplace if all of these systems are being turned off. I assure all of the end users out there that machine vision systems are not being turned off.
Speaker 0 00:07:37 Does it ever happen? Yes. Why does it happen? Usually because of poor specification and poor design, which is one of my key jobs and one of my key interests. But machine vision is an extremely valuable technology. I'll give you one more story related to that. I was on an expert panel a little while back, and one of the panel members, coming from the AI environment, was very excited about a brand-new application that was being done for inspecting French fries in the food industry. And I was just internally laughing so hard. French fry inspection was one of the first applications implemented in an industrial setting in machine vision, back in about 1979 or 1980, and I've actually done it myself and seen those applications back when I started in '83. So for those of you who've been in the industry about 10 or 12 years, just remember there are many decades of very successful machine vision that have gone before.
Speaker 1 00:08:52 And what's, uh, what's old is new again, or what's new is old again. I don't know. It's kind of funny. So, so let's talk a little bit about hardware, because I think that's kind of what we're talking about. Uh, so we've got software we've got, well, I'm sorry if we're talking about hardware, we got lenses, we've got lighting, we've got speed. What are some of the things that you're seeing in the hardware? And are you getting excited about this?
Speaker 0 00:09:13 Yeah, there's no doubt that the really the component advances are, uh, are awesome. Uh, it almost seems to me like they have, uh, ramped up the scale of advances in the last 10 to 15 years. Uh, not that, uh, hardware advances always didn't happen, but, uh, it seems that we're really seeing, uh, incredibly rapid, uh, advances and, and significant advances in all of these areas. Uh, the, uh, particularly speed of, uh, of both processing and speed of imaging has enabled a lot of applications that while you could do it before made it, uh, it was still more, it was more difficult, more costly, uh, higher speeds, uh, reduce, uh, camera count, they improve, uh, processing time and so on. So, uh, th that's been a huge, uh, benefit in, in including the advances in resolution of the cameras, but lighting and lenses, this, uh, it, there's a tangible, uh, acknowledgement nowadays of the need for competent illumination, competent, uh, optics for machine vision application.
Speaker 0 00:10:24 And I can't agree with that more. We, we make the application more reliable by adding and by incorporating, uh, competent and targeted illumination and, uh, optical components for the application. And, uh, so we see advances in, uh, resolution of, uh, optical resolution of lenses, advances in lens formats. And then, uh, all of the lighting changes of the availability of broader spectrum illumination, uh, particularly the availability of, uh, illumination in non-visible wavelengths, like near, near eye are in short way by our, uh, these are a great boon to, uh, to machine vision and I'll add it, it really, in the end makes the application less costly than it may have been before. We may have been able to accomplish some of these applications before, but they were more costly because the, uh, the componentry and the difficulty of implementing the componentry was harder, uh, and it's getting, uh, is still there's cost to the components, but it's getting easier to implement. And that really when you get right down to it, the cost of, uh, the cost of any system is in the engineering primarily and, uh, a little bit less than the current.
Speaker 1 00:11:36 So when, when you see an application come in and we're going to talk about some use cases in a minute, um,
Speaker 2 00:11:42 What, where, where do you start? Do you
Speaker 1 00:11:44 Start with the lens? Do you start with the lighting, uh, or do you start with a camera? Like, I'm just kind of curious,
Speaker 0 00:11:50 Or this is one of my favorite topics, and you'll, you'll notice, uh, in, in my webcasts and other other presentations, I get a little bit animated about this topic, uh, because it's what I do, but it's also what I truly believe is a, uh, a path to success in, in machine vision. And, and you can extrapolate it to any high-tech, uh, integration task. And the, the actual starting point, uh, has nothing to do with the components. It has nothing to do with the technology. The actual starting point is to understand the application, understand what the, uh, the process is, understand the needs of the application relative to the, uh, industrial process or the automated process, and really dig into that and, and get a good specification of what, uh, both the needs of the process and the needs of the customer are, uh, within that process.
Speaker 0 00:12:45 And, uh, ultimately, what is the outcome expected from, uh, adding this, uh, technology? This was going to say expensive, but adding this complex technology to your process, what outcome is going to be achieved? Um, I had an example just recently where, uh, I went with a, a salesman to a customer and the customer said, we want, we want a, uh, a camera that will guide this robot. Uh, sure, fine. We can, we can do that. But after walk, after walking through their, uh, their, uh, their automation process and understanding, and starting to ask questions and really understanding what the needs of the process were, it turns out that they didn't need a camera to, to guide a robot. They needed a camera for us for several other things that were actually the root cause of their, uh, of their issue and the, and going to be the root solution of their, of their application.
Speaker 0 00:13:42 So it's a, it's a common thing. People don't spend enough time on that. Um, now not to not decide step your question, if we, if we get right down to it, the first thing, the first thing in the, in any application where we're addressing a cam where we're using a camera is to, uh, understand what the, uh, what the field of view is technically, and then, uh, identify the magnification from there. I can spec anything, a lens, a light camera, uh, but once we understand that that's, that's, that's a great starting point. The, as far as cameras, cameras, lenses, and lighting, uh, it's funny, it's usually a kind of a, a combined process. The mind thinks about lens, light camera all at once, uh, after having gathered the right specifications.
Speaker 1 00:14:32 No, that's a great explanation. You probably have saved that customer or potential customer, like thousands of dollars and lots and lots of time by focusing in on what the actual use is of, of integrating vision technology. Absolutely.
Speaker 0 00:14:48 And also making sure they don't turn it off. Right.
Speaker 1 00:14:51 Absolutely. What are some of those use cases or trends, or maybe applications that you can talk about? Because we know there's lots you can't talk about.
Speaker 0 00:15:01 Well, there are many, there's so many diverse use cases. And again, I, I mentioned I have the pleasure of working as an integrator for most of my career and being able to see just anything from, uh, uh, from, uh, inspecting food product, uh, to, uh, measuring, measuring, uh, uh, missiles. So it's a, it's, uh, a very diverse, uh, automation environment that can use machine vision and can benefit from machine vision as a technology, uh, some of the most recent ones, uh, that I I've been involved with and being a technologist. Let me, let me approach the answer sorta from a technology point of view. Uh, some of the most interesting recent ones that I've been involved with are those that have been enabled by, uh, some of the, let's say trending, uh, technologies. And as I mentioned earlier, it's the, the, the, the advancement in components that continues to drive more and more, uh, uh, we'll call them trending tech, uh, trending applications.
Speaker 0 00:16:03 So it's the advancement of the components that are the drivers. Um, one is, uh, in, in pharmaceutical industry, uh, the ability of the, uh, the hyperspectral cameras, uh, particularly in a short wave re uh, SWR a short wave infrared wavelengths to differentiate the chemical compounds, uh, and even very closely, uh, uh, similar chemical compounds in a pharmaceutical environment. And, uh, this has been, uh, while hyperspectral is not new, this is, uh, the, the continued acceptance or the increased acceptance of these types of imaging systems is really making these applications, uh, more, more accessible. And that goes into the food industry as well. Uh, I'll also, I'd add, uh, the advances in thermal imaging, uh, for me have been pretty exciting, uh, in that, uh, we can do, uh, again, not a new technology, but, uh, with new, uh, components, uh, less, less expensive components, um, higher imaging rates, higher resolutions in these, in that type of a thermal imager, uh, we can do things like, uh, see, uh, the foil underneath the caps of, uh, of, uh, uh, bottles in a pharma or a food environment, um, and see how that foil has been seen, uh, see whether that foil has been successfully heat sealed to the cap, um, and, and a variety of other things.
Speaker 0 00:17:27 Of course, during the pandemic, we, uh, uh, we looked into a thermal imaging, uh, for, uh, temperature analysis, automated temperature analysis of people's faces, and it was rather successful. Thank goodness that's over, but it was rather successful at the time. So, uh, actually those are a couple of the application basis, I think are, are extremely interesting. The one thing I'd say about trending though, in applications and trending and technology is I never want to leave the impression, uh, with the end-users are with the, uh, those of you who are using the technology. I never want to leave the impression that you have to move towards the trending or pay attention to the trending technologies machine vision, as we already talked about is such a wonderful, mature technology that, uh, and so many things can be, uh, achieved just with, uh, the, the 2d grayscale camera. And, uh, I, I wouldn't, I, I wouldn't want everyone anyone to, um, hold out for, let's say, hold out for being able to use, uh, a, a trending quote unquote trending technology when that when your application may be, uh, perfectly solvable with, uh, uh, mature, uh, commonplace machine vision. Uh, and, uh, you know, if, if, if you need help, an integrator of course can help you with that. But, uh, uh, we want to make sure that all applications are, uh, all possible applications are addressed. So
Speaker 1 00:18:52 Kind of the meat and potatoes stuff, right. And that kiss principle, if you just need something simple. Absolutely. It's put something simple in there. Indeed. I have a question for you when you're into the hyperspectral. So that really means stuff that we can't see with our naked eyes, uh you're into infrared or near infrared, is that different lenses at diff different cameras? Like, what does that look like?
Speaker 0 00:19:14 Oh, it, it definitely is. Uh, uh, but let me, uh, just for the, for the sake of clarity, let's, uh, let me give a couple of words about that technology. Um, uh, again, I don't want to focus on any one technology, but it is an interesting one. And I think, uh, uh, we see, uh, folks, uh, having, uh, new applications hyperspectral doesn't necessarily mean a non-visible. Um, a lot of hype, a lot of hyper-spectral analysis has done in the visible wavelengths, uh, at the point of hyper, uh, but also down in the, uh, near I R and, uh, square wave links as well. Uh, the, uh, point of hyperspectral is that, uh, it it's hyperspectral because it's able to collect very narrow, uh, spectral bandwidth of the, uh, of the scene that it's looking at it at any one moment. Uh, and because of those bandwidth, because of the, uh, number of bandwidth it's able to collect of the reflected light, it acts as a, uh, spatial, uh, uh, spectral, uh, uh, spatial spectrometer.
Speaker 0 00:20:21 Now there's a spec, a spectrometer that can, that has, that has spatial content. And so, uh, we see, and we ended up seeing an image that is spatially, correct, but anywhere in that image, we can get the, uh, uh, again, chemical or the material content of the, uh, of the, of whatever's in that image. Uh, if it, if it is a, if it is something that can be, uh, that can be analyzed using reflected light in a wide weight wide wavelengths. So, um, but you, you make a very good point in that, uh, when we get into, uh, particularly into, um, uh, square rave blanks or long wave, uh, Longwave IRR, as in thermal imaging, uh, the need for it, we need special lighting. We need special optics to be able to transmit the swear content or the, uh, the square content. And even NIR, it looks is better with a specialized optics.
Speaker 0 00:21:19 Uh, so, um, this is just another point, uh, that we made earlier that the, uh, the lens manufacturers are advancing these technologies by providing a relatively, uh, accessible and relatively easy to find, uh, optics, and even the light, even lighting for this type of, for this type of imaging in the non-visible or in the hyperspectral non-visible range. We do a lot of imaging and non-visible even if it isn't hyperspectral, and that's, uh, that's another good, uh, wide, broad use case that where we can make a lot of inroads into, uh, into helping people with applications. David
Speaker 1 00:21:55 Was going to ask you a little bit about packaging. And when I say packaging, I kind of mean like the, the camera packaging, the lens packaging, what are your end customers looking for, or are they, uh, when it comes to the size of the weight, the cost of the, uh, the hardware,
Speaker 0 00:22:10 Well, uh, size and weight, um, I think the, uh, particularly camera and indeed the lighting manufacturers, uh, and, uh, lens manufacturers are doing the best job they can to keep the size and weight down. It is important. It is important, but there are, uh, just plain certain laws of physics that we have to, that we have to, uh, accommodate, uh, particularly in lenses, particularly in light. Uh, the cameras are getting smaller and smaller, but the, the sensor size is, is fixed at, uh, at certain physical limits relative to the resolution. Uh, and so in terms of size and weight, um, the components we have are excellent in the marketplace today, uh, but always a consideration. I don't think that, I think that, uh, we can rely on the manufacturers to provide us with size and weight. That is, um, again, within the physical limits of what's possible.
Speaker 0 00:23:03 So we, we do pay attention to that as we're doing the applications, uh, particularly if we have, um, uh, automation where the camera's going to be moved or held by a robot and, and things like that, uh, the cost is a very interesting, uh, interesting point. I loved the, I love the cost discussion. Uh, I alluded to this earlier, uh, the cost of, uh, cost of a system lies primarily in the, in the cost to engineer the system, uh, engineers are much more expensive than cameras. Believe me, uh, the, uh, and even if you're doing it in house, even if you are not an integrator, like, like we are, but if you're, if you're doing your own in house, the cost for your, on, for your engineering team to implement, uh, the, the, the, the correct, and let's make sure we use that term, the correct components for the application is often more expensive than the component.
Speaker 0 00:24:01 And so th th I think that's a consideration, many people overlook when they're considering the return on investment for machine vision. Uh, they see a high price on the, they see a high price tag on a particular camera, and some of these specialized cameras are very expensive, uh, but they don't understand that still, uh, the engineering involved with implementing that is more, is more, is more costly usually than the component itself. The other aspect is cost is, um, it's, it's not a matter of how much it costs is a how much, it's a matter of how much value it provides to you, uh, the, the cost, uh, while we might say, boy, I wish that, that, um, uh, $25,000 thermal camera only costs $5,000, but the, probably the marketplace and the pure technical reality is it's not going to, uh, come down, uh, to, to those kinds of cost levels. But if I can, if we can, uh, as an industry make that, uh, make the case that, that kind of an application, uh, re has a tremendous return on investment. And usually it does, uh, then the cost is, uh, again, once again, somewhat of a real irrelevant discussion, except for the, I guess, the shock factor when people find out how much some, uh, advanced technologies actually do cost,
Speaker 1 00:25:20 When you, uh, when you think about, uh, the applications, um, how can companies are interested in doing something maybe unique envision? How can they reduce their risk? Is it some kind of maybe engine engineered study that they can engage with an automation integrator?
Speaker 0 00:25:36 Oh, yes. Uh, and, uh, risk is, uh, is an important word in the automation of any technology, uh, any techno technology system, any automation system. And, uh, of course the goal of the let's let's, let's, let's separate this from end user to, uh, to, uh, integration partner. The goal of the end user is to mitigate their risk and the job of any, uh, integration partner is to take over that risk. And so I often say that, uh, the really the job of the integrator is, well, I like the phrase I like to use often is that we have to make it work, but really at the, at the most, uh, lowest level commercially, the, the job is to take over the risk of that application for the customer, uh, the customer or the end user certainly can do machine vision on their own, uh, even, but even with a simple system, uh, it, you, it usually, or it may be, uh, uh, of a greater importance to the end user, uh, to mitigate their risk by engaging a integration partner who can take over that job.
Speaker 0 00:26:46 And then in terms of execution, yes, uh, preliminary studies, um, preliminary feasibility studies, even online application studies, uh, to prove capability are very, very important. We do a lot of that, uh, upfront, and that's really part of what I do in, in the, in the whole scheme of things. And, uh, very often at a certain level of what we, what we provide is, is not, uh, uh, is not charged for it. We do it for free. Uh, we do a study or we do an, an, uh, an analysis of functional specification, uh, just as part of our normal process of analyzing the application. It protects us and the customer to know that we can do it, but, uh, for more advanced applications is particularly with, uh, new components, or let's say in situations where we can't really predict what's going to happen in the protect production environment. Certainly a feasibility study is a good way to, uh, to quantify that, that what you think is going to be your proof of concept.
Speaker 1 00:27:45 And I think it's a great approach because not only does it reduce risk for the customer who maybe they thought it was going to be a a hundred thousand dollars thing, and it ends up being a $50,000 thing, it also reduces the risk for the integrator. They don't have to charge so much, they become more competitive. Uh, so it, I think that investment, uh, that adding more time into your automation, um, journey, uh, is a really good idea. So, David, I want to switch the conversation a little bit and talk about something you mentioned earlier about artificial intelligence and about how AI is changing the software part of machine vision.
Speaker 0 00:28:21 Yeah. Great. Uh, AI, uh, let, let, let me, let me preface the answer in the discussion a little bit. Um, we use the term AI, we throw it around a lot, uh, and, uh, when it really comes down to it, uh, AI doesn't mean much of anything, technologically, uh, I don't mean to be to nitpick, but being a technologist. I say, what, what is AI? Uh, the, the sheer fact of it is, is that in, in our marketplace today, particularly in machine vision, um, we're using the term AI to mean deep learning, right? Um, it, it, it doesn't necessarily have to be deep learning. There are a lot of other things going on in AI, if you're a, uh, an aficionado of the technology, there's a lot of the, of the concept. There's a lot of other things going on, but for our purposes, when we say AI, uh, and I don't mean to correct the, the question when we say, Hey, AI, we're really talking about deep learning now.
Speaker 0 00:29:18 And the, of course, the bottom line is that deep learning, uh, since it really emerged back in 20 12, 20, I guess, emerging in that 2012 to 2015 period, and now more maturing, uh, deep learning is an incredibly valuable tool, uh, for, uh, machine vision, uh, and or computer vision. And there's yet another dichotomy of terms. And we don't really know, uh, people use them more or less interchangeably, but I'll stick with machine vision in the industrial sense. Uh, it's an incredible tool in, uh, in, uh, machine vision in the industrial sense when it's applied correctly. Um, the, uh, deep learning is not suitable for every application. Uh, it, uh, it is primarily good in inspection applications where a subjective analysis or a subjective, uh, analysis of the image or subjective decisions need to be made about the content of the image, particularly for defect detection or assembly verification, uh, even OCR.
Speaker 0 00:30:24 And these are the things that deep learning is incredibly good at, uh, in, again, in that context of being sure that we've specified it and implemented it correctly, one of the biggest problems of deep learning, uh, I shouldn't say problem, but one of the biggest, uh, I think obstacles in the marketplace for deep learning is the fact that it can't is that the re is the fact that the outcome can't be predicted in advance, right? Uh, and you hear this all the time. I'm not telling you anything new, but, uh, the, the outcome of, uh, the, the reliability of a particular deep learning implementation is hugely dependent upon the data. And you can, you can do a quick proof of concept with, uh, with, uh, quickly gathered images, a few quickly gathered images, but that doesn't predict the, uh, the final outcome. And it could be months, literally months before, you know, the final outcome.
Speaker 0 00:31:14 Why, uh, we're looking for defects in a production environment. How often do you really defects you don't, you don't get them all the time and you need to, you need to have defects to, uh, to train the model. And so, uh, we'd love, we love deep learning in the appropriate environments and where the customer is well aware of the process. Okay. The deep learning task, which is to collect images, uh, Trey have experts train the models, re-evaluate the models in terms of their capability and classification, reliability, and then collect more data and collect more data. A lot of people, uh, I have resorted to tuning, tuning the models, uh, getting a data scientist into tune the models. I think that the conventional wisdom right now particularly is, is being offered by some of the real experts in deep learning is that the, it has to be data centric is that the data are the important part of the deep learning task.
Speaker 0 00:32:10 Um, you can, uh, and, and really that's held up by some statistics. You can, uh, the statistics showed that you, that you can train models a lot and only gain a little bit of efficiency or a little bit better reliability. You can train data however, better and better, and it gives it results in a much better end result of the efficiency. I think the other, uh, the other thing I want to say about, I'd say about AI and machine vision is to remember that it's still machine vision, a one, a, one of my other favorite statements. It's, it's still machine vision. We still have to take an image. Uh, you may take your cell phone and hold it up in front of an object and move that cell phone around just until you get the perfect image and then snap the image, and then take a few of those and, and do a proof of concept and deep learning.
Speaker 0 00:32:57 But the reality is in the industrial environment, you don't get to hold a cell phone on the production line and move it around so that it sees every, every single thing that you want to see. And so, uh, the bottom line is, uh, imaging will still be an important part of the task and machine vision, even when we're using these very, very valuable tools of machine, uh, of deep learning. And, uh, so it, it becomes more of a hybrid solution. And that's what we've adapted here. And integral is more of a hybrid solution to deep learning, where we involve the, uh, where we involve the PR image acquisition and processing upfront, and then, uh, turn it over deep learning at the point where we have, uh, the suitable images and, uh, the data that are necessary to get the right models.
Speaker 1 00:33:43 David, you must talk a lot about strategies for data with all of your clients and customers and people in the industry, especially those who are collecting and archiving data. Maybe it's a pharmaceutical application, or a medical device, or nuclear, and somebody has to actually keep all this data. I'm wondering, and maybe this should be a different podcast session, but what are some of the things you're seeing? This data must be hugely important.
Speaker 0 00:34:10 Well, that's a very interesting question, and as you say, it almost diverges from machine vision as we talk about it. In particular, I have to say that some of the deep learning companies are taking that direction. As they talk about computer vision in industrial automation, it's not so much that they're going to detect the defects, we'll leave that up to machine vision. What advanced computer vision and really advanced deep learning can do is pull all of that data to the cloud, and we're talking tens of thousands or even hundreds of thousands of images, and even information from other sensing devices: timestamped, coordinated pieces of data. They then use that to understand what's going on in the process.
Speaker 0 00:35:10 And I think this is a tremendous initiative. I personally don't get involved in that so much from the machine vision end, but it's something we're seeing in the industry. Again, it diverges from actually inspecting things to pulling those data into the cloud and working on them to understand what's going on in the process. The companies doing that are very interesting; of course there are some big names there, like Microsoft, Oracle, and Intel, just to name a few. It's a very interesting side of our automation technology, and it's certainly enabled by machine vision, because we're gathering the images and effectively labeling them by saying that's a good one, that's a bad one. Then they go up into the cloud where they can be worked on.
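At the data level, "effectively labeling" might look something like the sketch below (the field names and values are hypothetical, not any particular vendor's schema): each inspection result is wrapped with its machine-vision verdict, a timestamp, and readings from other sensors, so a cloud-side process can correlate them later.

```python
import json
from datetime import datetime, timezone

def make_record(image_id, label, station, sensors):
    # One coordinated, timestamped piece of data per inspection:
    # the machine-vision verdict ("good"/"bad") becomes the label.
    return {
        "image_id": image_id,   # key of the image in blob storage
        "label": label,         # machine-vision verdict
        "station": station,
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "sensors": sensors,     # other process data from the same moment
    }

batch = [
    make_record("cam1/000123.png", "good", "station-4",
                {"temp_c": 41.2, "line_speed": 0.8}),
    make_record("cam1/000124.png", "bad", "station-4",
                {"temp_c": 44.9, "line_speed": 0.8}),
]

# Serialized for upload; a cloud pipeline can aggregate hundreds of
# thousands of these records and mine them for process trends.
payload = json.dumps(batch, indent=2)
print(payload[:80])
```

The value of records like these is exactly the correlation David mentions: once every image carries a label, a timestamp, and synchronized sensor readings, process questions (does temperature drift predict defects?) become queries rather than guesswork.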
Speaker 1 00:36:00 David, this has been fun. Thank you very much for coming on. Have we forgotten to talk about anything that you'd like to address today?
Speaker 0 00:36:07 No, I don't think so. Jim, you know me, I could probably talk about machine vision for the next five hours straight, and I thoroughly enjoy talking with you as always. Thank you for this opportunity to share some of these great ideas.
Speaker 1 00:36:22 If someone's listening and maybe wants to reach out, how best should they get in touch with you?
Speaker 0 00:36:28 Do contact me. I, I, I really, I I'm sincere about this. I love to talk to people about machine vision. You can ask me questions and I will try to answer, uh, I don't, I won't ignore your emails, but contact me at, uh, D
[email protected]. Uh, best way to get a best way to get a quick access to me is just, uh, uh, look me up on LinkedIn. It's LinkedIn in David deco, all one word
Speaker 1 00:36:54 And Dechow is spelled D-E-C-H-O-W, just for those people who might not be able to find you.
Speaker 0 00:37:01 Thank you, Jim. That's correct.
Speaker 1 00:37:04 Our sponsor for this episode is Ehrhardt Automation Systems. Ehrhardt builds and commissions turnkey solutions for their customers worldwide. With over 80 years of precision manufacturing, they understand the complex world of robotics, automated manufacturing, and project management, delivering world-class custom automation on time and on budget. Contact one of their sales engineers to see what Ehrhardt can build for you. Ehrhardt is spelled E-H-R-H-A-R-D-T. I'd also like to thank our partner A3, the Association for Advancing Automation. They're the leading trade association in the world for robotics, vision and imaging, motion control and motors, and industrial artificial intelligence technologies. Visit automate.org to learn more. And I'd like to thank our partner Painted Robot. Painted Robot builds and integrates digital solutions. They're a web development firm that offers SEO and social marketing and can set up and connect CRM and other ERP tools to unify marketing, sales, and operations. You can find them at paintedrobot.com. And if you'd like to get in touch with us at The Robot Industry Podcast, and by us I mean me, Jim Beretta, you can find me on LinkedIn. We'll see you next time. Thanks for listening. Be safe out there. Today's podcast was produced by Customer Traction Industrial Marketing, and I'd like to thank my nephew Chris Gray for the music, Chris Coleman for audio production, my partner Janet, our partners A3 and Painted Robot, and our sponsor Ehrhardt Automation Systems.