
Building a BERT Docker Image and Starting Flask in Production Mode

Published: 2019-11-13 22:14:27


Two major technical points sorted out in one day; very satisfying.

1. Dockerfile

FROM harbor.xxx.com.cn/3rd_part/tensorflow:1.14.0-gpu-py3-jupyter

LABEL "maintainer"="xxx4k"
LABEL "version"="1.0"

#COPY numpy-1.17.4-cp36-none-linux_x86_64.whl /tmp/
#COPY pyzmq-18.1.0-cp36-none-linux_x86_64.whl /tmp/

#RUN pip install /tmp/numpy-1.17.4-cp36-none-linux_x86_64.whl \
#    && pip install /tmp/pyzmq-18.1.0-cp36-none-linux_x86_64.whl

RUN pip install --no-cache-dir \
      -i http://xxx.com.cn/root/pypi/+simple/ \
      --trusted-host xxx.com.cn \
      tensorflow==1.14.0 bert-base==0.0.9 flask flask_compress flask_cors flask_json \
    && rm -rf /tmp/* \
    && rm -rf ~/.cache/pip \
    && echo "finished"

 

2. Modifying http.py

Reference URLs:
https://www.jianshu.com/p/beab4df088df
https://blog.csdn.net/jusang486/article/details/82382358
https://blog.csdn.net/AbeBetter/article/details/77652457
https://blog.csdn.net/anh3000/article/details/83047027

If you serve with app.run() in flask, the output always shows this warning:

* Serving Flask app "bert_base.server.http" (lazy loading)
* Environment: production
WARNING: This is a development server. Do not use it in a production deployment.
Use a production WSGI server instead.
* Debug mode: off
I1113 02:44:05.755911 139926823548672 _internal.py:122] * Running on http://0.0.0.0:8091/ (Press CTRL+C to quit)

 

So how can this be improved?
You can use nginx or tornado.
When solving it purely in code, tornado is the first choice.
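The pattern is small enough to sketch in isolation: wrap any WSGI app (a Flask app is one) in Tornado's WSGIContainer and serve it with HTTPServer instead of app.run(). In this sketch, simple_app is an illustrative stand-in for a Flask app, and port 8091 only mirrors the -http_port used later; neither is part of the original project.

```python
# Minimal sketch: serving a WSGI app with Tornado instead of Flask's dev server.
# simple_app is a hypothetical stand-in; any WSGI callable (e.g. a Flask app
# object) can be passed to serve() the same way.
import asyncio


def simple_app(environ, start_response):
    # Bare-bones WSGI handler that always answers 200 with a short body.
    start_response('200 OK', [('Content-Type', 'text/plain')])
    return [b'pong']


def serve(app, port=8091):
    # Tornado 5+ runs on asyncio, so the serving thread needs its own loop.
    from tornado.httpserver import HTTPServer
    from tornado.ioloop import IOLoop
    from tornado.wsgi import WSGIContainer

    asyncio.set_event_loop(asyncio.new_event_loop())
    HTTPServer(WSGIContainer(app)).listen(port)
    IOLoop.current().start()
```

Calling serve(app) blocks the current process on Tornado's event loop, which is why the class below runs it inside a multiprocessing.Process.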

from multiprocessing import Process
from tornado.wsgi import WSGIContainer
from tornado.httpserver import HTTPServer
from tornado.ioloop import IOLoop
import asyncio
from termcolor import colored

from .helper import set_logger


class BertHTTPProxy(Process):
    def __init__(self, args):
        super().__init__()
        self.args = args

    def create_flask_app(self):
        try:
            from flask import Flask, request
            from flask_compress import Compress
            from flask_cors import CORS
            from flask_json import FlaskJSON, as_json, JsonError
            from bert_base.client import ConcurrentBertClient
        except ImportError:
            raise ImportError('BertClient or Flask or its dependencies are not fully installed, '
                              'they are required for serving HTTP requests. '
                              'Please use "pip install -U bert-serving-server[http]" to install it.')

        # support up to 10 concurrent HTTP requests
        bc = ConcurrentBertClient(max_concurrency=self.args.http_max_connect,
                                  port=self.args.port, port_out=self.args.port_out,
                                  output_fmt='list', mode=self.args.mode)
        app = Flask(__name__)
        logger = set_logger(colored('PROXY', 'red'))

        @app.route('/status/server', methods=['GET'])
        @as_json
        def get_server_status():
            return bc.server_status

        @app.route('/status/client', methods=['GET'])
        @as_json
        def get_client_status():
            return bc.status

        @app.route('/encode', methods=['POST'])
        @as_json
        def encode_query():
            data = request.form if request.form else request.json
            try:
                logger.info('new request from %s' % request.remote_addr)
                print(data)
                return {'id': data['id'],
                        'result': bc.encode(data['texts'], is_tokenized=bool(
                            data['is_tokenized']) if 'is_tokenized' in data else False)}

            except Exception as e:
                logger.error('error when handling HTTP request', exc_info=True)
                raise JsonError(description=str(e), type=str(type(e).__name__))

        CORS(app, origins=self.args.cors)
        FlaskJSON(app)
        Compress().init_app(app)
        return app

    def run(self):
        app = self.create_flask_app()
        # app.run(port=self.args.http_port, threaded=True, host='0.0.0.0')
        # With tornado 5, calling asyncio.set_event_loop is all that is needed
        asyncio.set_event_loop(asyncio.new_event_loop())
        http_server = HTTPServer(WSGIContainer(app))
        http_server.listen(self.args.http_port)
        IOLoop.instance().start()

 

3. Startup command

bert-base-serving-start -bert_model_dir "/bert/outputFile_pb" -model_dir "/bert/outputFile_pb" -model_pb_dir "/bert/outputFile_pb" -mode CLASS -max_seq_len 64 -http_port 8091 -port 5575 -port_out 5576 
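With the server started as above, a client only needs to POST JSON with an id, a texts list, and an optional is_tokenized flag to /encode. The sketch below uses only the standard library; the localhost:8091 URL matches the -http_port above but is otherwise an assumption, as are the helper names.

```python
# Hypothetical client sketch for the /encode endpoint served above.
import json
from urllib import request as urlrequest


def build_encode_payload(req_id, texts, is_tokenized=False):
    # Mirrors the fields the encode_query handler reads from the request body.
    return {'id': req_id, 'texts': texts, 'is_tokenized': is_tokenized}


def encode(texts, url='http://localhost:8091/encode'):
    # POST the payload as JSON and decode the JSON response.
    payload = json.dumps(build_encode_payload(1, texts)).encode('utf-8')
    req = urlrequest.Request(url, data=payload,
                             headers={'Content-Type': 'application/json'})
    with urlrequest.urlopen(req) as resp:
        return json.loads(resp.read().decode('utf-8'))
```

The response carries the same id back along with a result field holding the encoded vectors, so callers can match answers to requests.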

 


Original article: https://www.cnblogs.com/aguncn/p/11853284.html
