Graphene image analysis by machine learning
Reference number | |
Coordinator | Stiftelsen Chalmers Industriteknik |
Funding from Vinnova | SEK 300 000 |
Project duration | March 2022 - November 2022 |
Status | Completed |
Venture | Strategic innovation program SIO Grafen |
Call | SIO Grafen: Collaboration on commercial applications with graphene - autumn 2021 |
Important results from the project
The project aimed to deliver a high-throughput, low-cost and general imaging technique that allows accurate, quantitative evaluation of graphene flakes. An unsupervised method was trained on dozens of optical microscope images to classify the thickness of graphene flakes, and together with Raman spectroscopy results, flakes with few layers could be identified. The project also found that at least 25% of the image area must be covered by graphene flakes for the present machine learning method to extract useful information about the different layer counts.
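The report does not name the specific unsupervised algorithm used, so the sketch below assumes k-means clustering of pixel colours (via scikit-learn and scikit-image) purely to illustrate how pixel-level grouping and the 25% coverage check could look in practice; the file path, cluster count and background heuristic are placeholders, not project settings.

```python
# Illustrative sketch only: unsupervised pixel clustering of an optical micrograph.
# k-means is an assumption; the project report does not specify the algorithm.
import numpy as np
from skimage import io
from sklearn.cluster import KMeans

def cluster_flake_pixels(image_path, n_clusters=4, coverage_threshold=0.25):
    """Group pixels by colour and check that flakes cover enough of the image."""
    image = io.imread(image_path)                      # RGB optical microscope image
    pixels = image[..., :3].reshape(-1, 3).astype(float)

    # Unsupervised grouping of pixel colours; clusters are expected to separate
    # the bare substrate from flake regions of different optical contrast.
    kmeans = KMeans(n_clusters=n_clusters, n_init=10, random_state=0)
    labels = kmeans.fit_predict(pixels)

    # Assume the largest cluster is the substrate/background (a heuristic for
    # this sketch, not something stated in the report).
    counts = np.bincount(labels, minlength=n_clusters)
    background = counts.argmax()
    flake_fraction = 1.0 - counts[background] / labels.size

    # The project found that roughly 25% of the image area should be covered
    # by flakes for the clustering output to be informative.
    if flake_fraction < coverage_threshold:
        print(f"Warning: only {flake_fraction:.0%} flake coverage; "
              "results may not be reliable.")
    return labels.reshape(image.shape[:2]), flake_fraction
```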
Expected long term effects
Based on a literature survey, well-prepared samples and images taken with an optical microscope, an unsupervised machine learning method was selected to classify graphene flakes according to their image pixel values. The selected method is useful for grouping objects in an image, but it is also very sensitive to changes in the optical microscope parameters. This project should be seen as an exploratory step; in the future, more advanced computer vision methods are needed to provide a robust solution.
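The sensitivity to microscope parameters can be illustrated with a small sketch: clusters fitted on one set of pixel values are applied to the same pixels after a simulated exposure change, and many pixels switch group. The synthetic pixel data, the 15% exposure factor and the cluster count are illustrative assumptions, not values from the project.

```python
# Illustrative sketch of sensitivity to imaging conditions, assuming the same
# k-means-style pixel clustering as above; data and parameters are synthetic.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
pixels = rng.uniform(0, 255, size=(10_000, 3))          # stand-in for image pixels

kmeans = KMeans(n_clusters=4, n_init=10, random_state=0).fit(pixels)
labels_original = kmeans.predict(pixels)

# Simulate a modest change in illumination/exposure between imaging sessions.
labels_brighter = kmeans.predict(np.clip(pixels * 1.15, 0, 255))

changed = np.mean(labels_original != labels_brighter)
print(f"{changed:.0%} of pixels change cluster after a 15% exposure shift")
```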
Approach and implementation
The pre-study provided the opportunity to test the concept of combining optical microscopy and machine learning for quality control of graphene. All partners were highly committed, and the graphene companies in particular expressed a strong need for a high-throughput characterization method for their up-scaled graphene products. The companies also gave valuable input on their quality-control concerns, and the project highlighted the complexity of sample preparation and of applying machine learning to samples with a much larger variation in graphene layer count than those reported in the literature.