dllib is a simple distributed deep learning framework running on Apache Spark. It aims to enable users and developers to run deep learning algorithms easily on Spark.

Designed for developers
dllib is designed to make implementing new optimizers or network layers easy for developers. The codebase is kept simple and easy to read.
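To give a flavor of what extending dllib can look like, here is a minimal, self-contained Scala sketch of a custom ReLU layer. The Layer trait below is a hypothetical placeholder written only for this example; it is not dllib's actual interface, so consult the source tree for the real extension points.

// Illustration only: this Layer trait is a made-up stand-in, NOT dllib's real interface.
trait Layer {
  def forward(input: Array[Double]): Array[Double]
  def backward(gradOutput: Array[Double]): Array[Double]
}

// A minimal ReLU layer: forward keeps positive activations, backward passes the
// gradient through only where the corresponding input was positive.
class ReLULayer extends Layer {
  private var mask: Array[Boolean] = Array.empty

  override def forward(input: Array[Double]): Array[Double] = {
    mask = input.map(_ > 0.0)
    input.map(x => math.max(0.0, x))
  }

  override def backward(gradOutput: Array[Double]): Array[Double] =
    gradOutput.zip(mask).map { case (g, m) => if (m) g else 0.0 }
}

object ReLULayerExample extends App {
  val layer = new ReLULayer
  println(layer.forward(Array(-1.0, 2.0, 3.0)).mkString(", "))  // 0.0, 2.0, 3.0
  println(layer.backward(Array(1.0, 1.0, 1.0)).mkString(", "))  // 0.0, 1.0, 1.0
}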
You can start using dllib with a single command line if you already have a Spark cluster; the package can be installed via Spark Packages.
The entire codebase is written in Scala, so you can extend the algorithms from any JVM language.
Installing via Spark Packages
$ ./bin/spark-shell --packages Lewuathe:dllib:0.0.8
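The same --packages flag is accepted by spark-submit as well, so a packaged application can pull in dllib without a manual install. For example (the main class and application jar names below are just placeholders):

$ ./bin/spark-submit --packages Lewuathe:dllib:0.0.8 --class your.Main your-app.jar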
dllib is distributed under the Apache License v2. See LICENSE for details.
dllib will help you to understand deep learning algorithms on Spark.
Feel free to get in touch if you have any questions or suggestions.