Deep learning material sensing platform augments laser cutting

The SensiCut smart sensing platform distinguishes between visually similar materials for safe use. (Image: MIT CSAIL)

MIT scientists have developed a smart material sensing platform for laser cutters powered by deep learning.

The platform could protect users from hazardous waste, provide material-specific knowledge, suggest subtle cutting adjustments for better results, and enable the engraving of items consisting of multiple materials (such as garments or phone cases).

Laser cutters can be used to process a variety of materials including metals, woods, papers, and plastics. However, users can face difficulties distinguishing between stockpiles of visually similar materials, which can result in the wrong one being processed. This can lead to gooey messes, horrendous odors, and the release of harmful chemicals.

The scientists, from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL), have therefore developed ‘SensiCut’, a smart material-sensing platform for laser cutters. 

Conventional camera-based identification approaches have been known to misidentify materials, while sticker tags (such as QR codes) used to label individual sheets can be accidentally cut off during processing. And if an incorrect tag is attached, the laser cutter will assume the wrong material type.

SensiCut, on the other hand, takes a more nuanced approach: it identifies materials by combining deep learning with an optical technique called 'speckle sensing', which uses a laser to probe a surface's microstructure.

'By augmenting standard laser cutters with lensless image sensors, we can easily identify visually similar materials commonly found in workshops and reduce overall waste,' said Mustafa Doga Dogan, PhD candidate at MIT CSAIL. 'We do this by leveraging a material’s micron-level surface structure, which is a unique characteristic even when visually similar to another type. Without that, you’d likely have to make an educated guess on the correct material name from a large database.' 

The team trained SensiCut’s deep neural network on more than 38,000 images spanning 30 different material types, after which it could differentiate between materials such as acrylic, foamboard, and styrene, and even provide further guidance on power and speed settings.
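The idea of classifying materials from their speckle patterns can be illustrated with a toy sketch. SensiCut itself uses a deep neural network trained on real speckle photographs; the stand-in below uses a nearest-centroid classifier on synthetic 8×8 "speckle" patches purely to show the shape of the pipeline. All names and data here are illustrative, not the authors' code.

```python
import numpy as np

rng = np.random.default_rng(0)

# Three of the materials the article mentions the system can tell apart.
MATERIALS = ["acrylic", "foamboard", "styrene"]

def synth_speckle(material_idx, n=1):
    """Generate fake speckle patches whose intensity shifts per material
    (a crude proxy for each material's unique surface microstructure)."""
    base = material_idx * 10.0
    return base + rng.normal(0.0, 1.0, size=(n, 8, 8))

# "Training": average many patches per material into a reference centroid
# (the real system instead fits CNN weights on ~38,000 images).
centroids = np.stack([synth_speckle(i, n=50).mean(axis=0)
                      for i in range(len(MATERIALS))])

def classify(patch):
    """Return the material whose centroid is closest to the patch."""
    dists = np.linalg.norm(centroids - patch, axis=(1, 2))
    return MATERIALS[int(np.argmin(dists))]

print(classify(synth_speckle(1)[0]))  # prints: foamboard
```

A deep network earns its keep when the class differences are subtle, high-dimensional texture statistics rather than the exaggerated intensity offsets used in this toy.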

In one experiment, the team built a face shield, which required distinguishing between several visually similar transparent sheets from a workshop. The user selected a design file in the interface, then used a 'pinpoint' function to move the laser to a point on the sheet and identify the material there. The laser light interacted with the micron-scale features of the surface and reflected off it, arriving at the pixels of an image sensor to produce a unique 2D speckle image. The system could then warn the user that their sheet was polycarbonate, which would release highly toxic fumes if cut by a laser.
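The pinpoint-and-warn flow described above amounts to a guard step after classification: sense the material at one point, then block the job if the result is on a hazard list. A minimal sketch, with hypothetical function and variable names (only polycarbonate's hazard is taken from the article):

```python
# Materials the article flags as unsafe to laser-cut; a real deployment
# would maintain a fuller list.
HAZARDOUS = {"polycarbonate"}

def pinpoint_check(classify_at, x, y):
    """Classify the sheet at position (x, y) and decide whether cutting
    may proceed. `classify_at` is assumed to capture a speckle image at
    that point and return the predicted material name."""
    material = classify_at(x, y)
    if material.lower() in HAZARDOUS:
        return material, False   # unsafe: warn the user, abort the cut
    return material, True        # safe to proceed

# Usage with a stub classifier that always "sees" polycarbonate:
material, safe = pinpoint_check(lambda x, y: "polycarbonate", 10, 20)
print(material, safe)  # prints: polycarbonate False
```

Keeping the safety decision outside the classifier makes the hazard list easy to audit and extend independently of the trained model.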

The speckle imaging technique was used inside a laser cutter with low-cost, off-the-shelf components, such as a Raspberry Pi Zero single-board computer. To make it compact, the team designed and 3D printed a lightweight mechanical housing. The researchers plan to present their work at the ACM Symposium on User Interface Software and Technology (UIST) in October.

Beyond laser cutters, they envision a future where SensiCut’s sensing technology could eventually be integrated into other fabrication tools such as 3D printers. To capture additional nuances, they also plan to extend the system by adding thickness detection, a pertinent variable in material makeup.
