AUTHOR=Mittal, Anshul and Aggarwal, Swati TITLE=Hyperparameter Optimization Using Sustainable Proof of Work in Blockchain JOURNAL=Frontiers in Blockchain VOLUME=3 YEAR=2020 URL=https://www.frontiersin.org/journals/blockchain/articles/10.3389/fbloc.2020.00023 DOI=10.3389/fbloc.2020.00023 ISSN=2624-7852 ABSTRACT=

Hyperparameters are pivotal for machine learning models: efficient calibration often yields gains that surpass those obtained by devising new approaches. Traditionally, human intervention is required to tune models, but this manual approach limits both efficiency and effectiveness. Automating this crucial aspect of learning in a sustainable way offers a significant boost in performance and cost optimization. Blockchain technology has revolutionized industries through its Proof-of-Work consensus algorithms. This computation-heavy mechanism, however, performs vast numbers of otherwise useless hash calculations across the nodes of the network and thereby wastes an enormous amount of energy. In this paper, we propose to repurpose these wasted computations for training deep learning models instead of calculating purposeless hash values, yielding a new consensus scheme. This work distinguishes itself from related works by capitalizing on the parallel-processing opportunities it creates for hyperparameter tuning of complex deep learning models. We address this aspect through the framework of Bayesian optimization, an effective methodology for the global optimization of functions with expensive evaluations. We call our work Proof of Deep Learning with Hyperparameter Optimization (PoDLwHO).
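The abstract frames hyperparameter tuning as Bayesian optimization: a probabilistic surrogate (here a Gaussian process) models the expensive objective, and an acquisition function (expected improvement) selects the next evaluation point. Below is a minimal one-dimensional sketch of that loop. The quadratic objective stands in for a model's validation loss, and the RBF kernel, search bounds, and grid-based acquisition maximization are simplifying assumptions for illustration, not details taken from the paper.

```python
import math
import numpy as np

def objective(x):
    # Stand-in for an expensive model evaluation, e.g. validation loss
    # as a function of a single hyperparameter. Minimum at x = 2.
    return (x - 2.0) ** 2

def rbf_kernel(a, b, length=1.0):
    # Squared-exponential (RBF) covariance between 1-D point sets a and b.
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / length) ** 2)

def gp_posterior(x_train, y_train, x_query, noise=1e-6):
    # Gaussian-process posterior mean and standard deviation at x_query.
    K = rbf_kernel(x_train, x_train) + noise * np.eye(len(x_train))
    Ks = rbf_kernel(x_train, x_query)
    Kss = rbf_kernel(x_query, x_query)
    K_inv = np.linalg.inv(K)
    mu = Ks.T @ K_inv @ y_train
    var = np.diag(Kss - Ks.T @ K_inv @ Ks)
    return mu, np.sqrt(np.maximum(var, 1e-12))

def norm_cdf(z):
    return 0.5 * (1.0 + np.array([math.erf(v / math.sqrt(2.0)) for v in z]))

def norm_pdf(z):
    return np.exp(-0.5 * z ** 2) / math.sqrt(2.0 * math.pi)

def expected_improvement(mu, sigma, y_best):
    # Closed-form EI for minimization under a Gaussian posterior.
    z = (y_best - mu) / sigma
    return (y_best - mu) * norm_cdf(z) + sigma * norm_pdf(z)

def bayes_opt(n_iter=15, bounds=(0.0, 5.0)):
    # BO loop: fit surrogate, maximize acquisition on a grid, evaluate, repeat.
    rng = np.random.default_rng(0)
    x = rng.uniform(bounds[0], bounds[1], size=3)  # initial random design
    y = objective(x)
    grid = np.linspace(bounds[0], bounds[1], 200)
    for _ in range(n_iter):
        mu, sigma = gp_posterior(x, y, grid)
        ei = expected_improvement(mu, sigma, y.min())
        x_next = grid[np.argmax(ei)]
        x = np.append(x, x_next)
        y = np.append(y, objective(x_next))
    return x[np.argmin(y)], y.min()
```

Each iteration spends one "expensive" evaluation where the surrogate predicts the greatest improvement, which is what makes the method attractive when evaluations correspond to full deep-learning training runs, as in the paper's setting.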