(For MindSpore 1.2 Only)
Build cloud inference serving with the MindSpore Serving API on a CMC training server
Full instructions are in the documentation below; this page is a condensed version of that doc.
Step 1: Download the .whl file directly
Step 2: Install the wheel file with pip
pip install mindspore_serving-1.3.0-cp37-cp37m-linux_aarch64.whl
On the CMC Apulis platform, log into your instance and change directory to:
/home/huaweiuser/userdata/serving/example/add
Project Directory Structure
The model in this example is a simple adder network (hence the model name ‘add’).
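For orientation, the example directory typically follows the layout of the public MindSpore Serving Add example, sketched below. The file name tensor_add.mindir and the version subdirectory 1 are taken from that example and may differ in your instance:

```
add/
├── add/                      # servable directory, named after the model
│   ├── servable_config.py    # servable configuration (method registration)
│   └── 1/                    # model version directory
│       └── tensor_add.mindir # exported MindIR model file
└── master_with_worker.py     # lightweight master/worker startup script
```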
<model_name>/servable_config.py — the servable configuration file; register the APIs (method names) you want your model to support here.

master_with_worker.py — master/worker configuration for lightweight deployment; configure the IP address and port number for your service here.

<aside>
💡 Only the POST method is supported by MindSpore Serving. The POST request format is:

POST http://${HOST}:${PORT}/model/${MODEL_NAME}[/version/${VERSION}]:${METHOD_NAME}
</aside>
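As a sketch of the servable_config.py described above, the following is adapted from the public MindSpore Serving Add example. The model file name tensor_add.mindir and the method name add_common are assumptions taken from that example, not from this page:

```python
import numpy as np
from mindspore_serving.worker import register

# Declare the servable: the exported MindIR model file for the Add network.
# File name is an assumption from the standard MindSpore Serving Add example.
register.declare_servable(servable_file="tensor_add.mindir",
                          model_format="MindIR",
                          with_batch_dim=False)


def add_trans_datatype(x1, x2):
    """Preprocess: cast both inputs to float32 before inference."""
    return x1.astype(np.float32), x2.astype(np.float32)


# Register the method name that clients use as ${METHOD_NAME} in the POST URL.
@register.register_method(output_names=["y"])
def add_common(x1, x2):
    y = register.call_servable(x1, x2)
    return y
```

The registered method name (here add_common) is what appears after the colon in the POST request format shown above.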
Inside master_with_worker.py, use the master.start_restful_server API to start the RESTful service. Using the Add network as an example, the arguments are:
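A minimal sketch of master_with_worker.py, adapted from the MindSpore Serving lightweight-deployment example; the IP address 127.0.0.1, port 1500, servable name "add", and device_id=0 are illustrative values, not confirmed by this page:

```python
import os
import sys
from mindspore_serving import master
from mindspore_serving import worker


def start():
    # Servable directory: the folder containing the add/ servable.
    servable_dir = os.path.dirname(os.path.realpath(sys.argv[0]))

    # Lightweight deployment: run the worker inside the master process.
    worker.start_servable_in_master(servable_dir, "add", device_id=0)

    # Start the RESTful server on the configured IP and port (assumed values).
    master.start_restful_server("127.0.0.1", 1500)


if __name__ == "__main__":
    start()
```

With the server running, a request following the POST format above might look like this (payload shape is illustrative):

curl -X POST 'http://127.0.0.1:1500/model/add/version/1:add_common' -d '{"instances": {"x1": [[1.0, 1.0]], "x2": [[2.0, 2.0]]}}'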