PASSCoDe: Parallel ASynchronous Stochastic dual Co-ordinate Descent
The Program
PASSCoDe implements multi-core parallel asynchronous dual coordinate descent (DCD) algorithms for training large-scale linear Support Vector Machines (SVMs).
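The solver variants listed under the -s option below (PASSCoDe-Wild, PASSCoDe-LOCK, PASSCoDe-ATOMIC) let multiple threads update dual variables asynchronously while maintaining a shared primal vector w; they differ only in how the concurrent updates to w are protected. The following is a minimal sketch of this idea for the L1-loss SVM dual using OpenMP; the data structures and names used here (Feature, Instance, X, y, alpha, w, C) are illustrative assumptions, not the actual PASSCoDe implementation.

// Minimal sketch of asynchronous dual coordinate descent (PASSCoDe-Wild style)
// for the L1-loss SVM dual, parallelized with OpenMP.  All names here are
// illustrative assumptions, not the actual PASSCoDe code.
#include <algorithm>
#include <cstddef>
#include <vector>
#include <omp.h>

struct Feature { int index; double value; };   // one entry of a sparse example
typedef std::vector<Feature> Instance;

void async_dcd_sketch(const std::vector<Instance> &X,  // training examples
                      const std::vector<int> &y,       // labels in {+1, -1}
                      double C, int n_features, int max_iter,
                      std::vector<double> &alpha,      // dual variables
                      std::vector<double> &w)          // shared primal vector
{
    alpha.assign(X.size(), 0.0);
    w.assign(n_features, 0.0);

    for (int iter = 0; iter < max_iter; ++iter) {
        // Each thread sweeps over its own chunk of dual coordinates; the paper
        // samples coordinates at random, which is omitted to keep this short.
        #pragma omp parallel for schedule(static)
        for (long i = 0; i < (long)X.size(); ++i) {
            const Instance &xi = X[i];

            // Dual gradient G = y_i * w^T x_i - 1 and Q_ii = x_i^T x_i.
            // The read of w may be stale: other threads update it concurrently.
            double G = -1.0, Qii = 0.0;
            for (size_t k = 0; k < xi.size(); ++k) {
                G   += y[i] * w[xi[k].index] * xi[k].value;
                Qii += xi[k].value * xi[k].value;
            }
            if (Qii <= 0.0) continue;   // skip empty examples

            // One-variable Newton step, projected onto the box [0, C].
            double old_a = alpha[i];
            double new_a = std::min(std::max(old_a - G / Qii, 0.0), C);
            alpha[i] = new_a;

            // Maintain w = sum_i alpha_i y_i x_i.  "Wild" leaves these writes
            // unprotected; "ATOMIC" would guard each write with
            // #pragma omp atomic, and "LOCK" would lock w around the update.
            double d = (new_a - old_a) * y[i];
            for (size_t k = 0; k < xi.size(); ++k)
                w[xi[k].index] += d * xi[k].value;
        }
    }
}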
Download
PASSCoDe is developed on top of the LIBLINEAR code base, with multi-core parallelization using OpenMP.
Download the zip file and extract it. On a UNIX system with GCC 4.7.2 or above, compile the program using the provided Makefile:
> make
The package provides a convert2binary tool for converting a training set into the binary format read by the trainer.
[Usage]: convert2binary training_set_file [training_binary]
For example, you can download datasets from here and convert them to the binary format using the following command.
> ./convert2binary dataset dataset.cbin
See README.passcode for more details.
Training is done with the train-shrinking program.
[Usage]: train-shrinking [options] training_set_file test_set_file
options:
-s type : set type of solver (default 31)
31 -- L2-regularized L2-loss support vector classification PASSCoDe-Wild (dual)
33 -- L2-regularized L1-loss support vector classification PASSCoDe-Wild (dual)
41 -- L2-regularized L2-loss support vector classification PASSCoDe-LOCK (dual)
43 -- L2-regularized L1-loss support vector classification PASSCoDe-LOCK (dual)
51 -- L2-regularized L2-loss support vector classification PASSCoDe-ATOMIC (dual)
53 -- L2-regularized L1-loss support vector classification PASSCoDe-ATOMIC (dual)
-c cost : set the parameter C (default 1)
-n nr_threads : set the number of threads
-t max_iterations : set the maximum number of iterations (default 100)
-b binary_mode : if binary_mode = 1, read binary format (default 1)
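For example, assuming dataset.cbin and dataset.t.cbin were produced by convert2binary (the file names are illustrative), the following command trains an L2-loss SVM with PASSCoDe-Wild using 4 threads and uses the second file as the test set.
> ./train-shrinking -s 31 -c 1 -n 4 dataset.cbin dataset.t.cbin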
Please see README.passcode for more details.
Please acknowledge the use of the code with a
citation.
PASSCoDe: Parallel ASynchronous Stochastic dual Co-ordinate Descent.
Cho-Jui Hsieh, Hsiang-Fu Yu, and Inderjit S. Dhillon
International Conference on Machine Learning (ICML), 2015.
[paper,
slides]
@inproceedings{cjh15akk,
title ={{PASSCoDe}: {P}arallel {AS}ynchronous {S}tochastic dual {Co}-ordinate {De}scent},
author={Cho-Jui Hsieh and Hsiang-Fu Yu and Inderjit S. Dhillon},
booktitle = {International Conference on Machine Learning},
year = {2015}
}
Bug reports and comments are
always appreciated. We would also like to know who is interested in our
work, so feel free to contact us.