DiffSharp: An AD Library for .NET Languages

Atılım Güneş Baydin, Barak A. Pearlmutter, and Jeffrey Mark Siskind

Abstract

DiffSharp is an algorithmic differentiation (AD) library for the .NET ecosystem, which is targeted by the C# and F# languages, among others. The library has been designed with machine learning applications in mind [Baydin2015b], allowing very succinct implementations of models and optimization routines. DiffSharp is implemented in F# and exposes forward and reverse AD operators as general nestable higher-order functions, usable by any .NET language. It provides high-performance linear algebra primitives (scalars, vectors, and matrices, with a generalization to tensors underway) that are fully supported by all the AD operators, and which use a BLAS/LAPACK backend via the highly optimized OpenBLAS library. DiffSharp currently uses operator overloading, but we are developing a transformation-based version of the library using F#'s "code quotation" metaprogramming facility [Syme2006]. Work on a CUDA-based GPU backend is also underway.
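To illustrate what "AD operators as nestable higher-order functions" looks like in practice, the following is a minimal sketch, assuming the DiffSharp 0.x F# API (the DiffSharp.AD.Float64 namespace with the D scalar and DV vector types, the diff and grad operators, and the toDV helper); names and signatures may differ between releases.

```fsharp
// Sketch only: assumes the DiffSharp.AD.Float64 API of the 0.x releases.
open DiffSharp.AD.Float64

// A scalar-to-scalar function written against the AD scalar type D
let f (x: D) = sin (sqrt x)

// Forward-mode derivative of f; diff returns an ordinary F# function
let df = diff f

// A vector-to-scalar function on the AD vector type DV
let g (x: DV) = exp (x.[0] * x.[1]) + x.[1]

// Reverse-mode gradient of g
let gg = grad g

let d  = df (D 2.5)             // derivative of f at 2.5
let gv = gg (toDV [1.5; 2.5])   // gradient of g at (1.5, 2.5)
```

Because diff and grad return plain functions, their results can themselves be passed back into the AD operators, which is the sense in which the operators are nestable.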

Book Title
7th International Conference on Algorithmic Differentiation, Christ Church Oxford, UK, September 12–15, 2016
Year
2016