Distributed Training
To build a truly decentralised NFT platform, the artist itself must first be created by decentralised means. AIRTIST introduces a Generative Adversarial Network (GAN) mechanism on the blockchain to realise deep neural network training.
To accelerate training, AIRTIST uses a loosely coupled distributed training method: the dataset is partitioned and distributed across several loosely coupled nodes (nodes located in different data centres or computing facilities). Every node runs an identical network topology but trains on its own data shard.
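The partitioning step can be sketched as a simple round-robin split, one shard per node. This is an illustrative sketch only; the function name `shard_dataset` and the round-robin scheme are assumptions, not part of AIRTIST's published design.

```python
def shard_dataset(samples, num_nodes):
    """Split samples round-robin so each loosely coupled node
    trains on its own disjoint shard of the data."""
    shards = [[] for _ in range(num_nodes)]
    for i, sample in enumerate(samples):
        shards[i % num_nodes].append(sample)
    return shards

# Ten samples split across three nodes:
shards = shard_dataset(list(range(10)), 3)
# → [[0, 3, 6, 9], [1, 4, 7], [2, 5, 8]]
```

Any partitioning scheme works as long as the shards are disjoint and roughly balanced, since each node trains independently between synchronisation points.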
Each node transmits its computed gradients at pre-specified time intervals and updates its own network weights after partial aggregation. This data-parallel training method effectively improves training speed. AIRTIST also plans to introduce AutoML technology in the future to optimise the network topology on top of this distributed computing layer, thereby improving overall network quality.
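The scheme described above, identical topologies, local training on private shards, and periodic gradient aggregation, can be illustrated with a minimal simulation. Here each "node" fits a single weight of the model y = w·x on its own shard and, every `sync_every` steps, the nodes' gradients are averaged and applied by all of them. All names and hyperparameters are hypothetical; this is a sketch of the general data-parallel pattern, not AIRTIST's actual protocol.

```python
def gradient(w, shard):
    """Gradient of mean squared error for the model y = w * x
    over one node's local data shard."""
    return sum(2 * x * (w * x - y) for x, y in shard) / len(shard)

def train(shards, steps=100, lr=0.01, sync_every=5):
    # Every node starts from the same topology and initial weight.
    weights = [0.0] * len(shards)
    for step in range(1, steps + 1):
        # Each node computes a gradient from its own shard only.
        grads = [gradient(w, s) for w, s in zip(weights, shards)]
        if step % sync_every == 0:
            # Pre-specified interval reached: aggregate across nodes
            # and let every node apply the averaged gradient.
            avg = sum(grads) / len(grads)
            grads = [avg] * len(grads)
        weights = [w - lr * g for w, g in zip(weights, grads)]
    return weights

# Data drawn from y = 3x, partitioned across three simulated nodes.
shards = [[(1, 3), (2, 6)], [(3, 9), (4, 12)], [(5, 15), (6, 18)]]
final = train(shards)  # each node's weight converges towards 3.0
```

Synchronising only every few steps is what makes the coupling "loose": nodes tolerate network latency between aggregation points at the cost of some drift between their local weights, which the periodic averaging corrects.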