LightCTR:一个轻量级的CTR 预估机器学习开源框架



LightCTR Overview

LightCTR is a lightweight and scalable framework that combines mainstream Click-Through-Rate prediction algorithms on a computational DAG with the philosophy of the Parameter Server and Ring-AllReduce collective communication. The library is suited to sparse data and designed for large-scale distributed model training.

Meanwhile, LightCTR is also an ongoing experimental study and an open source project oriented toward code readers. Its clear execution logic should be of value to learners in machine-learning related fields.

Features

- Distributed training based on Parameter Server and Ring-AllReduce collective communication
- Directed Acyclic Graph (DAG) of autograd computation
- Gradient clipping, stale synchronous parallel (SSP), and asynchronous SGD with delay compensation
- Compressing neural networks with half precision and quantization (PQ or Int8)
- Shared parameter key-value pairs stored on physical nodes by DHT in shared memory
- Lock-free multi-threaded training and SIMD operations
- Optimizers including Mini-Batch GD, Adagrad, FTRL, Adam, etc.

List of Implemented Algorithms

- Wide & Deep Model
- Factorization Machine, Field-aware Factorization Machine, Neural Factorization Machine
- Gradient Boosting Tree Model
- Gaussian Mixture Clustering Model
- Topic Model PLSA, Embedding Model
- Ngram Convolutional Neural Network, Self-Attention Recurrent Neural Network
- Variational AutoEncoder
- Approximate Nearest Neighbors Retrieval

Benchmark

High performance

Scalable

Introduction

For community discovery

For behavior sequences

For content analysis

Hierarchical model fusion

Multi-machine, multi-threaded parallel computing

LightCTR achieves high-performance numerical computing on a single machine through SIMD vectorized instructions, pipeline parallelism, multi-core multi-threaded computation, cache-aware layout, and other optimizations. When the model's parameter count exceeds a single machine's memory capacity, or single-machine training cannot meet timeliness requirements, LightCTR further provides a scalable cluster training scheme based on the Parameter Server and Ring-AllReduce.

In Parameter Server mode, the cluster consists of three roles: Master, ParamServer, and Worker. Each cluster has a single Master responsible for cluster startup and maintaining run state. The sparse and dense tensor parameters of a large-scale model are distributed across multiple ParamServers via a DHT, which cooperate with multiple Workers performing data-parallel gradient computation. In each training round, a Worker first pulls the parameters for one batch of samples from the ParamServers, and pushes the computed gradients back to the ParamServers for aggregation. The ParamServers update parameters asynchronously, lock-free, and semi-synchronously, using techniques such as gradient top-K truncation and delayed-gradient compensation. Parameters are stored compactly on the ParamServers and are retained or evicted according to feature hit rate; gradient traffic is compressed with variable-length encoding, half precision, or Int8, and network synchronization efficiency is improved through batched parameter requests and read/write separation.

In ring-topology Ring-AllReduce mode, LightCTR dynamically self-balances cluster training progress without introducing a coordinator node, and combines this with a gradient-fusion mechanism to achieve efficient, highly stable, decentralized gradient synchronization, well suited to scalable training of dense-parameter models. In this mode each node stores the full model and can serve inference predictions on its own; during training, each node obtains the gradient results of the other nodes through a bounded number of iterations, and with an appropriate cluster size, batch size, and learning rate, near-linear speedup of the training task can be achieved.

The LightCTR distributed cluster adopts failover mechanisms such as heartbeat monitoring and message retransmission. In addition, LightCTR is exploring network optimizations such as RDMA, DPDK, and multi-NIC configurations to reduce communication latency.

Quick Start

Welcome to Contribute

Everyone interested in the intersection of machine learning and scalable systems is welcome to contribute code, create issues, or open pull requests. LightCTR is released under the Apache License, Version 2.0.

Disclaimer

Please note that LightCTR is still under active development and is provided without any warranties as to its suitability or usability.
