The Digital Supply Chain podcast

Computer vision in supply chain - a chat with Cogniac

November 16, 2020 Tom Raftery / Bill Kish / Chuck Myers Season 1 Episode 85

Another of the cool startups I came across in the SAP.io New York Foundry is called Cogniac.

Cogniac is a company that uses convolutional neural networks to quickly and painlessly automate visual inspection tasks in all kinds of scenarios.

I invited Cogniac CEO Chuck Myers, and Cogniac CTO and co-founder Bill Kish to come on the podcast to tell me more, and we had a fascinating conversation where I learned how their solution is being used in scenarios as diverse as railroads, automotive companies, and timber yards.

This was a really interesting episode of the podcast. I thoroughly enjoyed it, and I hope you do too.

If you have any comments/suggestions or questions for the podcast - feel free to leave me a voice message over on my SpeakPipe page or just send it to me as a direct message on Twitter/LinkedIn. Audio messages will get played (unless you specifically ask me not to).

To learn how supply chain leaders improve end-to-end supply chain visibility, download the research study of 1,000 COOs and Chief Supply Chain Officers – “Surviving and Thriving: How Supply Chain Leaders Minimize Risk and Maximize Opportunities”.


And if you want to know more about any of SAP's Digital Supply Chain solutions, head on over to www.sap.com/digitalsupplychain and if you liked this show, please don't forget to rate and/or review it. It makes a big difference to help new people discover it. Thanks.

And remember, stay healthy, stay safe, stay sane!


Chuck Myers:

The traditional vision systems that are out there today are just absolutely ripe for this change, and it's craving this change, because you want to go back and be able to find out, you know, what did I do wrong, what did I do right, and where do I need to be in the future?

Tom Raftery:

Good morning, good afternoon, or good evening, wherever you are in the world. This is the Digital Supply Chain podcast, the number one podcast focussing on the digitisation of supply chain, and I'm your host, Global Vice President at SAP, Tom Raftery. Hey, everyone, welcome to the Digital Supply Chain podcast. My name is Tom Raftery with SAP, and with me on the show today I have two guests, Bill and Chuck. Bill, would you like to introduce yourself, and then Chuck?

Bill Kish:

Sure. Hi, I'm Bill Kish. I'm the CTO and co-founder at Cogniac Corporation.

Tom Raftery:

And Chuck.

Chuck Myers:

Hi, I'm Chuck Myers. I'm the CEO of Cogniac Corporation.

Tom Raftery:

Superb, superb. Guys, welcome to the show. Can you tell me what is Cogniac?

Chuck Myers:

Cogniac is a machine vision and computer vision system, primarily for the industrial and automation space, for identifying anything that needs to be identified, at a superhuman level, in a visual environment. If you can see it in an image, and you know what you're looking for, we can identify it at a superhuman level. OK, and then I'll turn it over to Bill, since it's his technical vision that created it.

Bill Kish:

Yeah, absolutely. If you can see it in images or video, you can train Cogniac. We provide superhuman visual inspection capabilities to domain experts in a variety of industrial, manufacturing, and other environments. We really put the power of modern AI techniques into the hands of subject matter experts: people who are working in industrial or manufacturing environments, typically, and have things that they want to optimise in their processes or in their operations. They can very quickly do that with Cogniac. They can get it into production in hours or days, versus traditional AI workflows, academic workflows, that typically require weeks and months of effort.

Tom Raftery:

Superb, before we get further into Cogniac, I should also mention that you guys are a Start-Up and you are part of the SAP.io Foundry in New York. Can you just talk to me a little bit about that first? How did you become involved in that and what's it doing for you?

Chuck Myers:

I'll touch on that. We were originally approached through SAP.io as a potential investment through the SAP investment group, and we found ourselves in the New York cohort. I think primarily because one of our co-founders is Amy Wang. She's a PhD, actually did her PhD at Columbia in computer science. And so we're part of a pretty diverse group. Bill and I are probably the only non-diverse employees in the company, and we really kind of thrive on our diversity. And I think that's how we ended up in the New York cohort instead of others, maybe in the Bay Area. We're really focussed on kind of partnering with SAP, and so far it's going pretty well, I have to say. They've done a phenomenal job in really trying to bring you in and be very inclusive in a very short period of time. We only have about three months as part of the development process with the cohort, and integrating our applications into the app marketplace within the SAP infrastructure. So it's a bit overwhelming in the short term, especially given the fact that we're working with Covid and coming into the holidays. But it's been a real advantage that you wouldn't get if you were just a company coming in from the outside. There's a very pre-planned, pre-programmed introductory experience that's been quite helpful. Bill, have you got any comments on that?

Bill Kish:

It's been good to get kind of first-hand support from SAP experts on integration with the various SAP APIs. It would take us a while to sort that out ourselves. So it's very helpful and good.

Tom Raftery:

And where are you guys, in terms of being a start-up? How mature are you, where are you with funding, how's all that going?

Chuck Myers:

We're a little over three years in, and the product is extremely mature for a company at this stage. Bill really locked himself down with about sixteen of the best AI people he could find, people that are world class, and they spent the original early round, which was roughly about ten million dollars, investing purely in product development. When I joined the company in February, I was brought on really to build up the sales and marketing and the go-to-market strategy for the product. With that, we've raised another ten million dollars to be almost purely invested in sales and marketing. So we're just kicking off that process now. Obviously, it's been an interesting concept with Covid, but I will say we did very well. There was a lot of interest financially in the company, and we're going to continue to see that. And we expect to raise another round in the spring to really kind of blast off the company. We feel very comfortable with where we are technically, and we've had some very good wins with customers even during the middle of Covid.

Tom Raftery:

Superb. That's great. That's great. And so, you mentioned it's a computer vision application. What problem is it solving, and for whom, typically?

Chuck Myers:

Bill you want to take that one?

Bill Kish:

Yeah, sure. So the core problem is building out inspection models that can learn from a subject matter expert automatically. We're trying to take existing inspection workflows, or even things that weren't possible to automate before, and be able to automate those, but without needing a computer vision expert, without needing a machine learning expert, without needing AI scientists. If a subject matter expert can point to something in an image, we want to be able to take that, automate that process, and very seamlessly integrate it with a real-world environment. Now, machine vision has been around for decades, mostly in a very limited manufacturing context, but traditional machine vision is very fragile. It starts with an engineering project. You have to get the lighting just perfect. You have to get the thing that you're looking at fixtured and presented just right. You have to select the lenses and cameras and all these things. But convolutional networks, and AI technology in general, make the software much more tolerant of imperfections in the environment and imperfections in the image, and you can accept much more real variation now using convolutional neural networks and other techniques. And so the challenge now is to dramatically reduce the effort involved in putting these types of applications into production, at the same time as they're dramatically expanding the range of things that you can actually successfully inspect. So that's really the power of convolutional neural networks: they allow you to work with 10 or 20 times more types of tasks than you could with traditional machine vision. But now our job is to make it 10 or 20 times easier to actually get those apps into production, by reducing the engineering overhead up front.
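Bill's point about tolerance to imperfect lighting can be shown with a toy example: a zero-sum convolutional filter (the kind of feature a CNN's first layer learns) responds to a seam in a part image identically even when the overall brightness drifts, which is exactly the situation where hand-tuned thresholds in traditional machine vision tend to break. This is a minimal pure-Python sketch of the idea, not Cogniac's implementation:

```python
def conv2d_valid(image, kernel):
    """Naive 2D valid-mode correlation, enough to illustrate the idea."""
    kh, kw = len(kernel), len(kernel[0])
    out_h, out_w = len(image) - kh + 1, len(image[0]) - kw + 1
    return [[sum(image[i + di][j + dj] * kernel[di][dj]
                 for di in range(kh) for dj in range(kw))
             for j in range(out_w)]
            for i in range(out_h)]

# A zero-sum vertical-edge filter (Sobel-like). Because its weights sum
# to zero, a uniform brightness change adds exactly zero to its response.
edge_kernel = [[-1, 0, 1],
               [-2, 0, 2],
               [-1, 0, 1]]

# Synthetic "part" image: dark on the left, bright on the right (a seam).
part = [[0.0] * 4 + [1.0] * 4 for _ in range(8)]
brighter = [[v + 0.3 for v in row] for row in part]  # lighting drifted

resp_a = conv2d_valid(part, edge_kernel)
resp_b = conv2d_valid(brighter, edge_kernel)

# The seam is detected with the same strength under the brightness shift.
same = all(abs(a - b) < 1e-9 for ra, rb in zip(resp_a, resp_b)
           for a, b in zip(ra, rb))
print(same, max(max(row) for row in resp_a))
```

A real network learns thousands of such filters from labelled examples instead of having them hand-designed, but the invariance sketched here is one reason the approach tolerates real-world variation that breaks fixed thresholds.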

Tom Raftery:

OK, and that's the kind of problem it's solving, I guess. But for whom are you solving that problem? Who would be a typical customer?

Bill Kish:

So in our first couple of years, we actually did a deep dive in the freight rail industry. It turns out they have an almost unlimited amount of visual inspection that they need to perform. It's a highly regulated industry, an extremely safety-conscious industry. It's also a hundred-and-fifty-year-old industry, so they're very, very conservative. But due to the challenges in that industry and the need for increased efficiencies, they really have to invest in technology to drive their operating metrics. And they have been looking at computer vision for the better part of a decade. They've seen it on the horizon. The more forward-looking ones, especially, have been investing for years in this area, and even before we came along, they had convinced themselves that this was an area that was going to be of high relevance to them. Now, they were struggling to actually put it into production. They knew theoretically this stuff works, but getting it into production was a whole other ball game. And so when they saw our system, and how easily it allows them to build these inspection models, and not only build them but control them, maintain them, and move them into production, that's really where they saw the path forward to full-scale production. So that's a typical case. Just an example of some of the applications we're doing there: this is one of the largest railroads in North America, one of the largest because they have tens of thousands of miles of track. We're inspecting something close to twenty-two million wheels a month with them, looking for cracks or other problems in the wheel that could lead to an imminent derailment. In fact, we've been told that there have been dozens of instances over the last year where they had to call in, radio up the train, and tell it to stop, literally stopping a freight train because of defects our system has found in a wheel that put that train at imminent risk of derailment.
So I think that, even though we're still a young company, it speaks to the level of refinement and prioritisation we have in our system now that this very conservative railroad is relying on us for such a mission-critical function.

Chuck Myers:

I'd like to add something that Bill didn't touch on: those twenty-two million train wheels a month that we look at are images taken at 60 miles an hour.

Tom Raftery:

That's what I was going to ask, because from what Bill was saying, they had to call up the train and get it to stop. So the images were taken of wheels in motion?

Chuck Myers:

Correct. And I think what you're talking about there goes back to our superhuman capability. Again, we're not just looking at one defect at one time, which is kind of the traditional machine vision problem. We're doing something truly superhuman. You know, no company in the world could afford to have a group of workers looking at twenty-two million images and making a legitimate decision in enough time to catch a safety violation. Now, we still put a human in the loop, but it's really a domain expert. The software identifies a defect and puts that up to a mechanical expert, who makes the determination on this crack or this brake: this is sufficient that we need to stop and fix it. You know, a typical train derailment, even with no side effects, is about a two-million-dollar problem, and that's if all things are perfect and there's no cargo. As you can imagine, if that derailment happens on a bridge, or happens with a chemical car, I mean, it's huge. And, you know, our customer has gone above and beyond what is really in the industry today with the implementation here. It is just superhuman.

Tom Raftery:

And just just as a as a practical question in this scenario, are the cameras mounted on the tracks by the side of the tracks, on the trains? How is that done?

Bill Kish:

This case is what they would call, in the railroad industry, wayside, which means next to the tracks, right outside the sort of safety zone that they can't have things in. And they're looking at the wheels as they roll by. Each wheel is captured a couple of times, so you get both the top and the bottom, from all the views that you need to see it. And those are actually processed in real time in one of our products, the EdgeFlow, which sits in what they call the data bungalow. The data bungalow is just a little environmental cabinet right next to those cameras, and everything's processed out there at the edge, because you can't really rely on the Internet connectivity to move all this high-bandwidth data back from these remote locations. If you think about where railroads operate and the industries that they serve, they're not exactly typically in a near-downtown metro area served by great fibre infrastructure; oftentimes they're out in the middle of South Dakota. And so it's important to have the end-to-end solution. Not only do we have our centralised system for building and maintaining these models, we have the complete end-to-end solution for pushing these models out to the edge, where they can execute next to the cameras.
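A rough sketch of the edge-processing pattern Bill describes: inference happens next to the cameras, and only flagged images cross the backhaul link. The model stub, frame fields, and threshold here are illustrative assumptions, not Cogniac's actual API:

```python
from collections import deque

def run_model(frame):
    # Stand-in for the on-device inspection model; a real deployment
    # would run a neural network on edge hardware next to the cameras.
    return frame["score"]

def edge_loop(frames, upload_queue, threshold=0.95):
    """Process every frame locally; queue only suspicious ones for upload.

    All of the high-bandwidth work happens at the trackside; only the
    rare flagged images cross the (slow, intermittent) backhaul link.
    """
    flagged = 0
    for frame in frames:
        if run_model(frame) >= threshold:
            upload_queue.append(frame)  # store-and-forward to the cloud
            flagged += 1
    return flagged

# Synthetic frames with deterministic scores cycling from 0.00 to 0.99.
frames = [{"id": i, "score": (i % 100) / 100} for i in range(1000)]
uploads = deque()
n_flagged = edge_loop(frames, uploads, threshold=0.95)
print(n_flagged, len(uploads))  # only 5% of frames leave the site
```

The store-and-forward queue matters because the link may be down for stretches at a time; the edge keeps inspecting regardless, and uploads drain when connectivity returns.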

Tom Raftery:

Fantastic. And again, correct me if I'm wrong on this, but I assume if a train has, let's say, two hundred wheels or four hundred wheels or whatever it is, and they all go by the cameras, and the camera goes, this one's fine, this one's fine, this one's fine, and so on, and hopefully for most trains going by, that's it. Does it send back four hundred messages saying this one's fine, this one's fine, or does it send one back saying all four hundred are fine?

Bill Kish:

So the way to think about it is, we're like this AI filter. We're filtering out all the normal stuff and elevating the things that are most unusual or most interesting to the subject matter experts. So the top point-oh-whatever percent of those train wheels that look suspicious will get uploaded back to the subject matter experts, at their mechanical desk, their operations centre, to take a look. And that's the human in the loop that works so well here, especially in safety-critical type applications.
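The filtering Bill describes, surfacing only the top fraction of a percent of wheels for expert review, is essentially a top-k selection over per-image anomaly scores. A hedged sketch, with made-up scores and fraction:

```python
def flag_for_review(scores, fraction=0.001):
    """Return indices of the top `fraction` most-suspicious items,
    i.e. the few images elevated to the subject matter experts
    while everything that looks normal is filtered out."""
    k = max(1, round(len(scores) * fraction))
    ranked = sorted(range(len(scores)), key=lambda i: scores[i], reverse=True)
    return ranked[:k]

# 10,000 wheel images, almost all normal, three genuinely suspicious.
scores = [0.01] * 9997 + [0.95, 0.99, 0.97]
review = flag_for_review(scores, fraction=0.0003)
print(sorted(review))  # the three suspicious wheels, and only those
```

In practice the fraction would be tuned so that the expert desk sees a manageable queue while keeping the miss rate acceptable for a safety-critical process.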

Tom Raftery:

Sure, sure. Now, you mentioned earlier on, if I heard you correctly, that it can be taught to recognise anything, or words to that effect. So obviously it's not just train wheels. What else can Cogniac help out with?

Chuck Myers:

So, a couple of examples to follow up on Bill's comment, which he glossed over: in that particular railroad, we're monitoring about thirty thousand miles of track in near real time. Those are cameras mounted on their revenue trains, so they're travelling all the time, updating the images of track everywhere, and that's every nut, bolt, screw, joint bar, and bonding wire. We're looking for cracks, warps, anything missing, round nuts, hex nuts, square nuts. Anything you can identify, we'll look for. We're doing things in the automotive industry that have never been done before, where we're looking at, you know, big giant stamped parts, we're talking 10 feet by seven feet, that are coming off a stamping line every four seconds, and we're looking for millimetre-size splits in that stamped part. This is a project that one of the largest car companies in the world, it's our customer, spent 10 years trying to solve with traditional machine vision, but frankly, it's just impossible. They then spent a year really trying to work with another company to build out an AI solution with the typical academic model and weren't able to do it. We were able to go in, in a matter of a couple of weeks, training on a very small sample set, to where we're now in the high nines of accuracy, and the software has gotten so good we're actually finding these splits before they become splits. So again, it's a superhuman level. I mean, when you have a 10-foot by seven-foot part, even with two people observing and pulling these parts off every four seconds, you just can't look at the part fast enough to spot the problem. And it's not just about the part, it's about the escape problem. If that part escapes and moves on to the next sequence in the assembly line, this part is a critical part, and it may not be discovered until it's made it all the way through paint.
And at that point, if you have this defect in there, you might have to scrap the entire chassis of the car. So you can imagine it's not about the few hundred dollars for the part, it's about the assembly, which is tens of thousands of dollars, and all the work that's gone into putting it together. We do things for the government, where we absorb satellite imagery and cell phone imagery. We're doing battery inspection for lithium-ion batteries, where, after the battery is assembled, we're ingesting x-ray images to look for single strands of small wire. Are they broken? Are they pinched? To make sure about the thermal circuitry in the lithium-ion batteries, because as we all know, there are battery thermal problems, and particularly in automotive those are big batteries, and you really want to be careful. We've looked at everything from particulate in injectable vials for pharmaceuticals, looking for glass sloughing in those: is it a bubble, or is it actually a piece of glass, or is it a human hair? So we can look at things down to that level. We're looking at nano-level stuff for silicon chip inspection. So there's a variety. Really, again, anything: if you know what you're looking for, whether it's from an electron microscope all the way up to a satellite, we can probably identify it. I think probably one of the most interesting and, frankly, complex projects we work on is for a major paper processor, one of the biggest in the world, where we actually look at trucks with logs on them, timber coming into mills, various mills: strand board mills, paper mills, plywood mills. And what we're looking at, as these trucks come in, we're trying to identify the logs from the standpoint of what are the diameters at the top and at the bottom. If you throw time in there, this is almost a four-dimensional problem, but it is truly a three-dimensional problem. What are the diameters of the logs, truly? Are they bent?
Are they crooked? Are there splits? Because all these affect how the wood goes into the mill, and you don't want to damage the mill or slow the process down. Is there rot on the wood? Is it hardwood or softwood? Because this is how the mills are paying for this wood. And it's also critical, analogous to the automotive problem, that these things don't escape and get into the next process, because when they shut down the next process, it is an incredibly complex system. It's the same as the other ones, though. It's drag and drop, and all you need is somebody that knows what they're looking for in that timber to make that determination. They don't have to be computer scientists, they don't have to be PhD data scientists; you just have to be an expert in what you know you're looking for.

Tom Raftery:

And it's not... you don't supply the cameras? You just take the feed of images from whatever the camera source is, then run it through your system and spit out the results?

Chuck Myers:

Correct. We'll provide some professional services guidance to the customers, just because, you know, we're really focussed on being the gold standard for our industry, and we want to make sure we put the right things in. But, you know, we have a team of really good integrators that we work with, and we educate them on what it is we need. And in the rail industry, we work with a couple of different providers that actually provide wayside cameras to the railroad industry, where they've been looking at these images manually for years.

Tom Raftery:

Right. And, again from a practical perspective, is it a cloud-delivered solution? Is it on-premise, or a hybrid? Is it up to you how you deploy it, or how does that work?

Chuck Myers:

It's really cloud focussed, and I'll let Bill describe the architecture, because it is a full, Industry 4.0-compliant architecture. But there are situations, like in the classified world, where you want an on-prem system. So I'll let Bill talk a little bit about that.

Bill Kish:

I would say it's best described as a hybrid model. Some folks consume it as pure cloud, and that's certainly an option. In all cases, the centralised system, our cloud core, is used for training and managing your visual inspection tasks, your visual inspection pipelines. And then in many cases that's augmented with our EdgeFlow, which is that plug-and-play appliance that will talk directly to the cameras, out next to where the images are. If you need highly available or low-latency predictions out at the edge, that EdgeFlow is critical. If you can absorb the Internet latency, or potentially the slightly lower availability associated with Internet access, then you can go with the pure cloud model as well.

Tom Raftery:

OK, nice. And you know, that's all excellent where you are now. Where to from here? What's the next kind of three to five years looking like for you guys?

Chuck Myers:

Bill, you want to talk about it from a technical perspective, and...

Bill Kish:

Yes. So I think the goal here is to be able to automate real-world operations across a number of mostly traditional industrial verticals. I think that's where the technology gap is largest: in automating real-world operations. The last decades have shown tremendous progress in what I call traditional IT, automation of traditional IT tasks. Things that you can do in a data centre or in the background are very well covered now. The frontier is out in the real world: real-world operations that aren't going away, that are increasingly a source of inefficiencies. Those are the frontier, and vision is the tool that can be used across a really wide range of these verticals to optimise real-world operations. That's really what we want to do. We want to be able to optimise these real-world operations across a variety of verticals. There are going to be some things, like autonomous driving, that are not really in scope for us; those solutions are going to be vertically integrated. But there's a long tail of other industries that need the benefit of this technology, that are structured such that they need a platform like ours to really have a go at optimising the things that they care about in their world. So that's where it's going to end up: there are going to be a lot of cameras looking at a lot of things, and a lot of real-world processes being optimised with these techniques.

Chuck Myers:

So, you know, if you look at what's happened in the world, and this is a little bit of my thought process before I joined the company, and Bill's vision, and it's one I was really taken aback by: if you look at the traditional machine vision business today, it is a 20-billion-dollar, very profitable business. But it's really developing machine vision to be a go/no-go. Did I put the right label on the can at the food processing plant? Is this part round, or is it square? Does it have a defect in it, so I push it off to the side? And that's the end of it. But what's happened in the world today is that especially industrial companies, but frankly almost every company, have become incredibly visual, and for a lot of reasons: for regulatory reasons, for, frankly, TQM reasons. You need to be able to maintain a workflow and a system of analytics. What's my ground truth for my image data, whether it's manufacturing data, whether it's regulatory data, whatever it is? So if you think of Salesforce in general, or something like that, or an element of SAP, as being your ground truth for all your customer data and your contacts and how you relate with your customer; and if you think of an enterprise-level SAP, or maybe one of the other companies, as being the ground truth for all your structured data, whether it's financial or HR or MRP or project management; what doesn't exist in the world today is: what's my ground truth for my images? Where we look at: what did the image look like at that time? What's all of the metadata associated with it? How can I run regression analytics on it as I move forward? How do I run predictive analytics on it? Everybody's talking about data science, and there are a lot of great data science tools out there for regular structured data.
But where's the data? You have to have this image data to run all the analytics on. So we see the scope of our business as extremely large, because we think we can solve that problem better than anybody else out there today, because we don't require an academic expert. We provide the academic expertise in the software, and we really think that's the big driver. So as you go forward, I mean, if you just take a 20-billion-dollar industry, and you change 30 percent of that industry, you can imagine our business all of a sudden becomes a six-billion-dollar business. The traditional vision systems that are out there today are just absolutely ripe for this change, and the industry is craving this change, because you want to go back and be able to find out, you know, what did I do wrong, what did I do right, and where do I need to be in the future? Whether you're monitoring an oil and gas well, or you're looking at vegetation problems for a power company (are my power lines clear, is there growth, with climate change?), or you're looking at your aeroplane turbines, right? You need to have a place where you can actually manipulate this data, and we really provide that whole system of workflow so that you're able to do that inside of an operating entity.
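Chuck's "ground truth for image data" idea amounts to a system of record that keeps each image alongside its capture metadata, the model's detections, and the expert's labels, so that "what did I do wrong, what did I do right" becomes a query. A hypothetical sketch; the schema and field names are my assumptions, not Cogniac's data model:

```python
from dataclasses import dataclass, field

@dataclass
class ImageRecord:
    image_uri: str          # where the pixels live (illustrative URI)
    captured_at: str        # ISO timestamp of capture
    site: str               # which line / press / mill produced it
    detections: list = field(default_factory=list)     # model outputs
    expert_labels: list = field(default_factory=list)  # human ground truth

def disagreements(records):
    """Records where model and expert disagree: the raw material for
    retrospective ('what did I do wrong?') analytics."""
    return [r for r in records
            if sorted(r.detections) != sorted(r.expert_labels)]

store = [
    ImageRecord("s3://plant/img-001.png", "2020-11-01T08:00:00Z", "press-3",
                detections=["split"], expert_labels=["split"]),  # agreed
    ImageRecord("s3://plant/img-002.png", "2020-11-01T08:00:04Z", "press-3",
                detections=["split"], expert_labels=[]),  # false positive
]
print(len(disagreements(store)))  # 1
```

With records like these accumulated over time, the regression and predictive analytics Chuck mentions become ordinary queries over the store rather than one-off forensic exercises.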

Tom Raftery:

OK, super. Great, great. Gentlemen, we've gone well over the 20 minutes, so it's time to kind of wind up this podcast and bring this baby home, as they say. Is there any question I've not asked that you think I should have asked, or any topic we've not addressed that you think it's important for people to be aware of?

Bill Kish:

Yeah, I would just mention that in many cases people think about this type of technology replacing human workers. But, you know, I would provide the perspective that what we're doing here is superhuman-level. These are things that were just not even possible to do with people before. And it's really that greenfield opportunity, that ten or twenty times expansion of things that just weren't even possible before that you can do with this technology now, that is really the key driver.

Tom Raftery:

Perfect.

Chuck Myers:

Yeah, I would echo that very much. We get a lot of questions when we go into a new client about, well, can this solve a human problem, can we take a human out of the loop? And our answer around that is: you don't build your ROI around that. You build your ROI around the fact that, all of a sudden, you're doing things you never expected. What you thought was a ten-thousand-dollar-a-month problem, you realise, is actually a two-hundred-thousand-dollar-a-month capture that you never realised was there. And that's where we really see the aha moment in our customers. After it's been implemented for a couple of weeks, they're like, wow, we never realised these were issues, we can change this. And that's where we really see, you know, the actual leverage of what we do here.

Tom Raftery:

Excellent, excellent. Chuck and Bill, if people want to know more about yourselves, or about Cogniac, or any of the things we've talked about in the last few minutes, where would you have me direct people?

Chuck Myers:

I would start with cogniac.co. It is dot co; we're working on that one still, and maybe Cogniac.ai. And I can be reached either on LinkedIn at Chuck Myers, or again through chuck@cogniac.co. And you can reach Bill at bill@cogniac.co.

Tom Raftery:

Fantastic. Gentlemen, that's been really interesting. Thanks a million for coming on the show today.

Chuck Myers:

Great. We really appreciate it, Tom.

Bill Kish:

Yeah, thanks Tom.

Tom Raftery:

OK, we've come to the end of the show. Thanks, everyone, for listening. If you'd like to know more about digital supply chain, head on over to sap.com/digitalsupplychain, or simply drop me an email at tom.raftery@sap.com. If you liked the show, please don't forget to subscribe to it in your podcast application of choice to get new episodes as soon as they're published. Also, please don't forget to rate and review the podcast. It really does help new people to find the show. Thanks. Catch you all next time.