
[Cloud Computing] Set Up a Neural Style Service in Ten Minutes


Neural style is an algorithm that repaints an image by imitating the painting style of an existing artwork; see [1] for the original paper.
The following shows how to set up an MxNet-based neural style service. Deploying it on Alibaba Cloud HPC (https://www.aliyun.com/product/hpc) takes no more than ten minutes.
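As background for what run.py is doing later: the algorithm in [1] treats the output image as the variable being optimized, so that it matches the content image in deep convolutional feature maps and matches the style image in the correlation (Gram-matrix) statistics of those feature maps. A summary in the paper's notation (not code from this repository):

\mathcal{L}_{\mathrm{total}}(\vec{p}, \vec{a}, \vec{x}) = \alpha\,\mathcal{L}_{\mathrm{content}}(\vec{p}, \vec{x}) + \beta\,\mathcal{L}_{\mathrm{style}}(\vec{a}, \vec{x})

where p is the content image, a the style image, and x the generated image; L_content is a squared error between feature maps at one layer, and L_style is a weighted sum over layers of squared errors between Gram matrices. The content_weight and style_weight arguments that appear in run.py's output below play the roles of alpha and beta.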


Get the MxNet source code:


# git clone https://github.com/dmlc/mxnet.git --recursive
Cloning into 'mxnet'...
remote: Counting objects: 20971, done.
remote: Compressing objects: 100% (10/10), done.
remote: Total 20971 (delta 4), reused 2 (delta 2), pack-reused 20959
Receiving objects: 100% (20971/20971), 5.67 MiB | 987.00 KiB/s, done.
Resolving deltas: 100% (12892/12892), done.
Submodule 'dmlc-core' (https://github.com/dmlc/dmlc-core.git) registered for path 'dmlc-core'
Submodule 'mshadow' (https://github.com/dmlc/mshadow.git) registered for path 'mshadow'
Submodule 'ps-lite' (https://github.com/dmlc/ps-lite) registered for path 'ps-lite'
Cloning into 'dmlc-core'...
remote: Counting objects: 3503, done.
remote: Total 3503 (delta 0), reused 0 (delta 0), pack-reused 3503
Receiving objects: 100% (3503/3503), 777.75 KiB | 144.00 KiB/s, done.
Resolving deltas: 100% (2075/2075), done.
Submodule path 'dmlc-core': checked out '0fb74229bc635946667f7dfd1c17116b37d0d870'
Cloning into 'mshadow'...
remote: Counting objects: 3566, done.
remote: Total 3566 (delta 0), reused 0 (delta 0), pack-reused 3566
Receiving objects: 100% (3566/3566), 1.17 MiB | 243.00 KiB/s, done.
Resolving deltas: 100% (2450/2450), done.
Submodule path 'mshadow': checked out 'f2df1886e43f114ae01a2384d77954acfede2aba'
Cloning into 'ps-lite'...
remote: Counting objects: 1687, done.
remote: Total 1687 (delta 0), reused 0 (delta 0), pack-reused 1687
Receiving objects: 100% (1687/1687), 523.36 KiB | 148.00 KiB/s, done.
Resolving deltas: 100% (1066/1066), done.
Submodule path 'ps-lite': checked out '7faaeb73bcb9d68b464186d3191494aa95243703'

Modify the build options
Edit make/config.mk



(1) 
# whether use CUDA during compile
USE_CUDA = 0
change to:
# whether use CUDA during compile
USE_CUDA = 1

(2)
# add the path to CUDA library to link and compile flag
# if you have already add them to environment variable, leave it as NONE
# USE_CUDA_PATH = /usr/local/cuda
USE_CUDA_PATH = NONE
change to:
# add the path to CUDA library to link and compile flag
# if you have already add them to environment variable, leave it as NONE
USE_CUDA_PATH = /usr/local/cuda
# USE_CUDA_PATH = NONE

(3)
# whether use CuDNN R3 library
USE_CUDNN = 0
change to:
# whether use CuDNN R3 library
USE_CUDNN = 1

(4)
# choose the version of blas you want to use
# can be: mkl, blas, atlas, openblas
# in default use atlas for linux while apple for osx
UNAME_S := $(shell uname -s)
ifeq ($(UNAME_S), Darwin)
USE_BLAS = apple
else
USE_BLAS = atlas
endif
change to:
# choose the version of blas you want to use
# can be: mkl, blas, atlas, openblas
# in default use atlas for linux while apple for osx
UNAME_S := $(shell uname -s)
ifeq ($(UNAME_S), Darwin)
USE_BLAS = apple
else
USE_BLAS = openblas
endif
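If you prefer to apply these four edits non-interactively, a minimal sketch with sed, assuming the default make/config.mk contents shown above (setting USE_CUDA_PATH directly is functionally the same as swapping which line is commented out):

# sed -i 's/^USE_CUDA = 0/USE_CUDA = 1/' make/config.mk
# sed -i 's|^USE_CUDA_PATH = NONE|USE_CUDA_PATH = /usr/local/cuda|' make/config.mk
# sed -i 's/^USE_CUDNN = 0/USE_CUDNN = 1/' make/config.mk
# sed -i 's/^USE_BLAS = atlas/USE_BLAS = openblas/' make/config.mk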

Edit the Makefile
CFLAGS += -I$(ROOTDIR)/mshadow/ -I$(ROOTDIR)/dmlc-core/include -fPIC -Iinclude $(MSHADOW_CFLAGS)
LDFLAGS = -pthread $(MSHADOW_LDFLAGS) $(DMLC_LDFLAGS)
change to:
CFLAGS += -I/disk1/deeplearning/local_install/include -I$(ROOTDIR)/mshadow/ -I$(ROOTDIR)/dmlc-core/include -fPIC -Iinclude $(MSHADOW_CFLAGS)
LDFLAGS = -L/disk1/deeplearning/local_install/lib -pthread $(MSHADOW_LDFLAGS) $(DMLC_LDFLAGS)
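The extra -I/-L entries point at a prefix where dependencies such as OpenCV and OpenBLAS were installed locally; /disk1/deeplearning/local_install is specific to this machine, so adjust it to your own environment. Assuming an opencv.pc file was installed under that prefix, you can check that pkg-config finds it before building:

# export PKG_CONFIG_PATH=/disk1/deeplearning/local_install/lib/pkgconfig/
# pkg-config --modversion opencv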

Build MxNet:

# export PKG_CONFIG_PATH=/disk1/deeplearning/local_install/lib/pkgconfig/
# make -j
……

Install:

# cd python/
# python setup.py install
……
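A quick sanity check that the Python package installed and that the CUDA build can actually see the GPU (a minimal sketch; it should print a 2x2 array of zeros allocated on GPU 0):

# python -c "import mxnet as mx; print(mx.nd.zeros((2, 2), ctx=mx.gpu(0)).asnumpy())"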

Prepare the Neural Style model and data:

# cd example/neural-style/
# ls
download.sh  find_mxnet.py  model_vgg19.py  README.md  run.py
# ./download.sh
--2016-03-28 16:57:38--  https://github.com/dmlc/web-data/raw/master/mxnet/neural-style/model/vgg19.params
Location: https://raw.githubusercontent.com/dmlc/web-data/master/mxnet/neural-style/model/vgg19.params [following]
--2016-03-28 16:57:54--  https://raw.githubusercontent.com/dmlc/web-data/master/mxnet/neural-style/model/vgg19.params
Length: 80099200 (76M) [application/octet-stream]
Saving to: 'vgg19.params'

100%[===================================================================================================================================================>] 80,099,200  13.1MB/s   in 6.2s

2016-03-28 16:58:11 (12.2 MB/s) - 'vgg19.params' saved [80099200/80099200])

--2016-03-28 16:58:11--  https://github.com/dmlc/web-data/raw/master/mxnet/neural-style/input/IMG_4343.jpg
Location: https://raw.githubusercontent.com/dmlc/web-data/master/mxnet/neural-style/input/IMG_4343.jpg [following]
--2016-03-28 16:58:13--  https://raw.githubusercontent.com/dmlc/web-data/master/mxnet/neural-style/input/IMG_4343.jpg
Length: 230594 (225K) [image/jpeg]
Saving to: 'IMG_4343.jpg'

100%[===================================================================================================================================================>] 230,594      902KB/s   in 0.2s

2016-03-28 16:58:14 (902 KB/s) - 'IMG_4343.jpg' saved [230594/230594])

--2016-03-28 16:58:14--  https://github.com/dmlc/web-data/raw/master/mxnet/neural-style/input/starry_night.jpg
Location: https://raw.githubusercontent.com/dmlc/web-data/master/mxnet/neural-style/input/starry_night.jpg [following]
--2016-03-28 16:58:15--  https://raw.githubusercontent.com/dmlc/web-data/master/mxnet/neural-style/input/starry_night.jpg
Length: 315236 (308K) [image/jpeg]
Saving to: 'starry_night.jpg'

100%[===================================================================================================================================================>] 315,236     1.25MB/s   in 0.2s

2016-03-28 16:58:16 (1.25 MB/s) - 'starry_night.jpg' saved [315236/315236])

Run the Neural Style example:

# python run.py
INFO:root:load the content image, size = (1000, 1500)
INFO:root:resize the content image to (400, 600)
INFO:root:start training arguments Namespace(content_image='input/IMG_4343.jpg', content_weight=10, gpu=0, lr=0.1, max_long_edge=600, max_num_epochs=1000, model='vgg19', output='output/out.jpg', remove_noise=0.2, save_epochs=50, stop_eps=0.005, style_image='input/starry_night.jpg', style_weight=1)
INFO:root:epoch 0, relative change 0.991449
INFO:root:epoch 1, relative change 0.639205
INFO:root:epoch 2, relative change 0.478858
INFO:root:epoch 3, relative change 0.385878
INFO:root:epoch 4, relative change 0.321652
INFO:root:epoch 5, relative change 0.274459
INFO:root:epoch 6, relative change 0.234445
INFO:root:epoch 7, relative change 0.185482
INFO:root:epoch 8, relative change 0.149244
INFO:root:epoch 9, relative change 0.127406
INFO:root:epoch 10, relative change 0.113780
INFO:root:Update[11]: Change learning rate to 9.00000e-02
INFO:root:epoch 11, relative change 0.101375
INFO:root:epoch 12, relative change 0.090326
INFO:root:epoch 13, relative change 0.084095
INFO:root:epoch 14, relative change 0.083168
INFO:root:epoch 15, relative change 0.085777
INFO:root:epoch 16, relative change 0.089530
INFO:root:epoch 17, relative change 0.092034
INFO:root:epoch 18, relative change 0.090668
INFO:root:epoch 19, relative change 0.086283
INFO:root:epoch 20, relative change 0.081812
INFO:root:Update[21]: Change learning rate to 8.10000e-02
INFO:root:epoch 21, relative change 0.076982
INFO:root:epoch 22, relative change 0.071818
INFO:root:epoch 23, relative change 0.066535
INFO:root:epoch 24, relative change 0.061628
INFO:root:epoch 25, relative change 0.057347
INFO:root:epoch 26, relative change 0.054294
INFO:root:epoch 27, relative change 0.053362
INFO:root:epoch 28, relative change 0.054526
INFO:root:epoch 29, relative change 0.056334
INFO:root:epoch 30, relative change 0.057255
INFO:root:Update[31]: Change learning rate to 7.29000e-02
INFO:root:epoch 31, relative change 0.056267
INFO:root:epoch 32, relative change 0.054968
INFO:root:epoch 33, relative change 0.054624
INFO:root:epoch 34, relative change 0.053384
INFO:root:epoch 35, relative change 0.051284
INFO:root:epoch 36, relative change 0.048232
INFO:root:epoch 37, relative change 0.046385
INFO:root:epoch 38, relative change 0.044870
INFO:root:epoch 39, relative change 0.043447
INFO:root:epoch 40, relative change 0.042659
INFO:root:Update[41]: Change learning rate to 6.56100e-02
INFO:root:epoch 41, relative change 0.039969
INFO:root:epoch 42, relative change 0.038084
INFO:root:epoch 43, relative change 0.036276
INFO:root:epoch 44, relative change 0.034864
INFO:root:epoch 45, relative change 0.034026
INFO:root:epoch 46, relative change 0.034046
INFO:root:epoch 47, relative change 0.033529
INFO:root:epoch 48, relative change 0.032085
INFO:root:epoch 49, relative change 0.031398
INFO:root:save output to output/tmp_50.jpg
/disk1/deeplearning/anaconda2/lib/python2.7/site-packages/skimage/util/dtype.py:111: UserWarning: Possible precision loss when converting from float64 to uint8
  "%s to %s" % (dtypeobj_in, dtypeobj))
INFO:root:epoch 50, relative change 0.031347
INFO:root:Update[51]: Change learning rate to 5.90490e-02
INFO:root:epoch 51, relative change 0.031458
INFO:root:epoch 52, relative change 0.030541
INFO:root:epoch 53, relative change 0.028714
INFO:root:epoch 54, relative change 0.028467
INFO:root:epoch 55, relative change 0.027554
INFO:root:epoch 56, relative change 0.025826
INFO:root:epoch 57, relative change 0.026127
INFO:root:epoch 58, relative change 0.024360
INFO:root:epoch 59, relative change 0.024311
INFO:root:epoch 60, relative change 0.022046
INFO:root:Update[61]: Change learning rate to 5.31441e-02
INFO:root:epoch 61, relative change 0.022097
INFO:root:epoch 62, relative change 0.019917
INFO:root:epoch 63, relative change 0.020184
INFO:root:epoch 64, relative change 0.017810
INFO:root:epoch 65, relative change 0.017987
INFO:root:epoch 66, relative change 0.016383
INFO:root:epoch 67, relative change 0.017593
INFO:root:epoch 68, relative change 0.015544
INFO:root:epoch 69, relative change 0.017172
INFO:root:epoch 70, relative change 0.015113
INFO:root:Update[71]: Change learning rate to 4.78297e-02
INFO:root:epoch 71, relative change 0.015867
INFO:root:epoch 72, relative change 0.014007
INFO:root:epoch 73, relative change 0.014800
INFO:root:epoch 74, relative change 0.013104
INFO:root:epoch 75, relative change 0.014838
INFO:root:epoch 76, relative change 0.012413
INFO:root:epoch 77, relative change 0.013954
INFO:root:epoch 78, relative change 0.012503
INFO:root:epoch 79, relative change 0.014131
INFO:root:epoch 80, relative change 0.011837
INFO:root:Update[81]: Change learning rate to 4.30467e-02
INFO:root:epoch 81, relative change 0.012995
INFO:root:epoch 82, relative change 0.011472
INFO:root:epoch 83, relative change 0.012635
INFO:root:epoch 84, relative change 0.010827
INFO:root:epoch 85, relative change 0.012640
INFO:root:epoch 86, relative change 0.010638
INFO:root:epoch 87, relative change 0.012243
INFO:root:epoch 88, relative change 0.010504
INFO:root:epoch 89, relative change 0.011789
INFO:root:epoch 90, relative change 0.010193
INFO:root:Update[91]: Change learning rate to 3.87420e-02
INFO:root:epoch 91, relative change 0.011762
INFO:root:epoch 92, relative change 0.009751
INFO:root:epoch 93, relative change 0.010758
INFO:root:epoch 94, relative change 0.009456
INFO:root:epoch 95, relative change 0.011349
INFO:root:epoch 96, relative change 0.009183
INFO:root:epoch 97, relative change 0.010511
INFO:root:epoch 98, relative change 0.009244
INFO:root:epoch 99, relative change 0.010658
INFO:root:save output to output/tmp_100.jpg
INFO:root:epoch 100, relative change 0.008773
INFO:root:Update[101]: Change learning rate to 3.48678e-02
INFO:root:epoch 101, relative change 0.010101
INFO:root:epoch 102, relative change 0.008547
INFO:root:epoch 103, relative change 0.009856
INFO:root:epoch 104, relative change 0.008356
INFO:root:epoch 105, relative change 0.009952
INFO:root:epoch 106, relative change 0.008303
INFO:root:epoch 107, relative change 0.009713
INFO:root:epoch 108, relative change 0.008323
INFO:root:epoch 109, relative change 0.009423
INFO:root:epoch 110, relative change 0.008171
INFO:root:Update[111]: Change learning rate to 3.13811e-02
INFO:root:epoch 111, relative change 0.009585
INFO:root:epoch 112, relative change 0.008034
INFO:root:epoch 113, relative change 0.008760
INFO:root:epoch 114, relative change 0.007746
INFO:root:epoch 115, relative change 0.009353
INFO:root:epoch 116, relative change 0.007418
INFO:root:epoch 117, relative change 0.008406
INFO:root:epoch 118, relative change 0.007384
INFO:root:epoch 119, relative change 0.008828
INFO:root:epoch 120, relative change 0.007084
INFO:root:Update[121]: Change learning rate to 2.82430e-02
INFO:root:epoch 121, relative change 0.008017
INFO:root:epoch 122, relative change 0.006935
INFO:root:epoch 123, relative change 0.008137
INFO:root:epoch 124, relative change 0.006811
INFO:root:epoch 125, relative change 0.008026
INFO:root:epoch 126, relative change 0.006733
INFO:root:epoch 127, relative change 0.007884
INFO:root:epoch 128, relative change 0.006674
INFO:root:epoch 129, relative change 0.007826
INFO:root:epoch 130, relative change 0.006627
INFO:root:Update[131]: Change learning rate to 2.54187e-02
INFO:root:epoch 131, relative change 0.007637
INFO:root:epoch 132, relative change 0.006630
INFO:root:epoch 133, relative change 0.007475
INFO:root:epoch 134, relative change 0.006713
INFO:root:epoch 135, relative change 0.007901
INFO:root:epoch 136, relative change 0.006373
INFO:root:epoch 137, relative change 0.007069
INFO:root:epoch 138, relative change 0.006252
INFO:root:epoch 139, relative change 0.007505
INFO:root:epoch 140, relative change 0.006020
INFO:root:Update[141]: Change learning rate to 2.28768e-02
INFO:root:epoch 141, relative change 0.006644
INFO:root:epoch 142, relative change 0.005896
INFO:root:epoch 143, relative change 0.006905
INFO:root:epoch 144, relative change 0.005786
INFO:root:epoch 145, relative change 0.006633
INFO:root:epoch 146, relative change 0.005680
INFO:root:epoch 147, relative change 0.006628
INFO:root:epoch 148, relative change 0.005598
INFO:root:epoch 149, relative change 0.006543
INFO:root:save output to output/tmp_150.jpg
INFO:root:epoch 150, relative change 0.005588
INFO:root:Update[151]: Change learning rate to 2.05891e-02
INFO:root:epoch 151, relative change 0.006343
INFO:root:epoch 152, relative change 0.005581
INFO:root:epoch 153, relative change 0.006313
INFO:root:epoch 154, relative change 0.005680
INFO:root:epoch 155, relative change 0.006554
INFO:root:epoch 156, relative change 0.005409
INFO:root:epoch 157, relative change 0.005958
INFO:root:epoch 158, relative change 0.005302
INFO:root:epoch 159, relative change 0.006252
INFO:root:epoch 160, relative change 0.005130
INFO:root:Update[161]: Change learning rate to 1.85302e-02
INFO:root:epoch 161, relative change 0.005618
INFO:root:epoch 162, relative change 0.005035
INFO:root:epoch 163, relative change 0.005801
INFO:root:epoch 164, relative change 0.004944
INFO:root:eps < args.stop_eps, training finished
INFO:root:save output to output/out.jpg

View the results


The content (input) image:
[img]http://img.blog.csdn.net/20160403175154047?watermark/2/text/aHR0cDovL2Jsb2cuY3Nkbi5uZXQv/font/5a6L5L2T/fontsize/400/fill/I0JBQkFCMA==/dissolve/70/gravity/SouthEast

The style image:
[img]http://img.blog.csdn.net/20160403175228001?watermark/2/text/aHR0cDovL2Jsb2cuY3Nkbi5uZXQv/font/5a6L5L2T/fontsize/400/fill/I0JBQkFCMA==/dissolve/70/gravity/SouthEast

The output image:
[img]http://img.blog.csdn.net/20160403175257610?watermark/2/text/aHR0cDovL2Jsb2cuY3Nkbi5uZXQv/font/5a6L5L2T/fontsize/400/fill/I0JBQkFCMA==/dissolve/70/gravity/SouthEast

A single style-transfer run takes about 1-2 minutes.
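To stylize your own images, pass them via command-line flags. The flag spellings below are inferred from the Namespace printed above and from argparse conventions (run python run.py -h to confirm); input/my_photo.jpg is a placeholder for your own file:

# python run.py --content-image input/my_photo.jpg --style-image input/starry_night.jpg --style-weight 2 --output output/my_out.jpg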
The demo above can only be invoked from the command line; you can also put a simple web server in front of it so users can upload images and download the results, i.e. an image style-transfer service (see the sketch below).
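For illustration only, a minimal sketch of such a server using Flask. Flask itself, the server.py file name, and the route/field names are assumptions of this sketch rather than part of the MxNet example, and the run.py flags are inferred from the Namespace printed above:

# server.py -- minimal style-transfer web service sketch (hypothetical; not part of the MxNet repo).
# It accepts an uploaded content image, shells out to run.py, and returns the stylized result.
import os
import subprocess

from flask import Flask, request, send_file

app = Flask(__name__)

@app.route('/stylize', methods=['POST'])
def stylize():
    # Save the uploaded content image next to the bundled inputs.
    upload = request.files['content']
    content_path = os.path.join('input', 'uploaded.jpg')
    upload.save(content_path)

    # Run the existing command-line demo; this blocks for the 1-2 minutes a transfer takes
    # and reloads the VGG model on every request (fine for a quick demo, not for production).
    output_path = os.path.join('output', 'out.jpg')
    subprocess.check_call(['python', 'run.py',
                           '--content-image', content_path,
                           '--style-image', 'input/starry_night.jpg',
                           '--output', output_path])

    # Send the stylized image back to the client.
    return send_file(output_path, mimetype='image/jpeg')

if __name__ == '__main__':
    app.run(host='0.0.0.0', port=8080)

With Flask installed, start it from example/neural-style/ and call it with, for example: curl -F content=@my_photo.jpg http://<server>:8080/stylize -o styled.jpg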
Try setting up your own service on Alibaba Cloud HPC right away! Entry:
For more new demos, follow the official Alibaba Cloud HPC forum.
References:
[1] A Neural Algorithm of Artistic Style, arXiv:1508.06576v2