
YoMo is a programming framework that helps developers build distributed cloud systems (Geo-Distributed Cloud Systems). YoMo's communication layer is built on top of the QUIC protocol, which provides high-speed data transmission, and it has a built-in Streaming Serverless "streaming function" that greatly improves the development experience for distributed cloud systems. Distributed cloud systems built with YoMo provide an ultra-high-speed communication mechanism between near-field computing power and terminals, with a wide range of application scenarios in the Metaverse, VR/AR, IoT, and other fields.

YoMo is written in Go. Its Streaming Serverless uses Golang plugins and shared libraries to dynamically load user code, which brings some limitations to developers, especially those using Windows. Combined with the Serverless architecture's hard requirement for isolation, this makes WebAssembly an excellent choice for running user-defined functions.

For example, in real-time AI inference for AR/VR and smart factories, a camera can send real-time unstructured data through YoMo to a computing node in a near-field MEC (multi-access edge computing) device, which automatically executes the hosted AI inference function. When the AI inference is complete, YoMo sends the result back to the end device in real time.

However, the challenge YoMo faces is to merge and manage handler functions written by multiple external developers in edge computing nodes. This requires runtime isolation of these functions without sacrificing performance. Traditional software container solutions, such as Docker, are not up to this task because they are too heavy and too slow to handle real-time tasks.

WebAssembly provides a lightweight and high-performance software container. It is very suitable as the runtime of YoMo data processing handler function.

In this article, we will show you how to create a Rust function for Tensorflow-based image recognition, compile it into WebAssembly, and then use YoMo to run it as a streaming data handler. We use WasmEdge as the WebAssembly runtime because, compared with other WebAssembly runtimes, WasmEdge provides the best performance and the highest flexibility. WasmEdge is the only WebAssembly virtual machine that stably supports Tensorflow. YoMo manages WasmEdge VM instances and the contained WebAssembly bytecode applications through WasmEdge's Golang API.

GitHub source code: https://github.com/yomorun/yomo-wasmedge-tensorflow

Prerequisites

Golang needs to be installed; we assume you already have it. The Golang version needs to be newer than 1.15 for our example to run.

You also need to install the YoMo CLI application, which orchestrates and coordinates the data flow and the handler function calls.

$ go install github.com/yomorun/cli/yomo@latest
$ yomo version
YoMo CLI version: v0.0.5

Next, install WasmEdge along with its Tensorflow and image shared libraries. WasmEdge is a leading WebAssembly runtime hosted by the CNCF. We will use it to embed and run WebAssembly programs from YoMo.

# Install WasmEdge
$ wget https://github.com/second-state/WasmEdge-go/releases/download/v0.8.1/install_wasmedge.sh
$ chmod +x ./install_wasmedge.sh
$ sudo ./install_wasmedge.sh /usr/local

# Install WasmEdge Tensorflow extension
$ wget https://github.com/second-state/WasmEdge-go/releases/download/v0.8.1/install_wasmedge_tensorflow_deps.sh
$ wget https://github.com/second-state/WasmEdge-go/releases/download/v0.8.1/install_wasmedge_tensorflow.sh
$ chmod +x ./install_wasmedge_tensorflow_deps.sh
$ chmod +x ./install_wasmedge_tensorflow.sh
$ sudo ./install_wasmedge_tensorflow_deps.sh /usr/local
$ sudo ./install_wasmedge_tensorflow.sh /usr/local

# Install WasmEdge Images extension
$ wget https://github.com/second-state/WasmEdge-go/releases/download/v0.8.1/install_wasmedge_image_deps.sh
$ wget https://github.com/second-state/WasmEdge-go/releases/download/v0.8.1/install_wasmedge_image.sh
$ chmod +x ./install_wasmedge_image_deps.sh
$ chmod +x ./install_wasmedge_image.sh
$ sudo ./install_wasmedge_image_deps.sh /usr/local
$ sudo ./install_wasmedge_image.sh /usr/local

Finally, because our demo WebAssembly function is written in Rust, you also need to install the Rust compiler and the rustwasmc toolchain.

For the other parts of the demo, you can fork and clone the source code repository.

$ git clone https://github.com/yomorun/yomo-wasmedge-tensorflow.git

Image classification function

The image recognition function, written in Rust, processes the YoMo image stream. It uses the WasmEdge Tensorflow API to process input images.

#[wasm_bindgen]
pub fn infer(image_data: &[u8]) -> String {
    // Load the TFLite model and its meta data (the text label for each recognized object number)
    let model_data: &[u8] = include_bytes!("lite-model_aiy_vision_classifier_food_V1_1.tflite");
    let labels = include_str!("aiy_food_V1_labelmap.txt");

    // Pre-process the image to a format that can be used by this model
    let flat_img = wasmedge_tensorflow_interface::load_jpg_image_to_rgb8(image_data, 192, 192);
    
    // Run the TFLite model using the WasmEdge Tensorflow API
    let mut session = wasmedge_tensorflow_interface::Session::new(&model_data, wasmedge_tensorflow_interface::ModelType::TensorFlowLite);
    session.add_input("input", &flat_img, &[1, 192, 192, 3])
           .run();
    let res_vec: Vec<u8> = session.get_output("MobilenetV1/Predictions/Softmax");

    // Find the object index in res_vec that has the greatest probability
    let mut max_index: usize = 0;
    let mut max_value: u8 = 0;
    for (i, v) in res_vec.iter().enumerate() {
        if *v > max_value {
            max_value = *v;
            max_index = i;
        }
    }

    // Translate the probability into a confidence level
    let confidence = if max_value > 200 {
        "is very likely"
    } else if max_value > 125 {
        "is likely"
    } else {
        "could be"
    };

    // Translate the object index into a label from the model meta data
    let food_name = labels.lines().nth(max_index).unwrap_or("unknown");

    let ret_str = format!(
        "It {} a <a href='https://www.google.com/search?q={}'>{}</a> in the picture",
        confidence, food_name, food_name
    );
    return ret_str;
}
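The post-processing summarized by the comments above boils down to an argmax over the softmax scores, a wording threshold, and a label lookup. As a self-contained illustration of that logic (the thresholds 200/125 and the toy label map are hypothetical), here is the same idea in plain Go:

```go
package main

import (
	"fmt"
	"strings"
)

// postProcess maps a softmax output vector (one u8 score per label) to a
// confidence phrase and a label: argmax over the scores, a wording
// threshold, then a lookup into a newline-separated label map.
func postProcess(scores []uint8, labelmap string) (string, string) {
	// Find the index with the greatest probability.
	maxIndex, maxValue := 0, uint8(0)
	for i, v := range scores {
		if v > maxValue {
			maxIndex, maxValue = i, v
		}
	}
	// Translate the raw score into a confidence phrase (illustrative thresholds).
	confidence := "could be"
	if maxValue > 200 {
		confidence = "is very likely"
	} else if maxValue > 125 {
		confidence = "is likely"
	}
	// One label per line in the label map.
	labels := strings.Split(labelmap, "\n")
	return confidence, labels[maxIndex]
}

func main() {
	labelmap := "background\nhot dog\npizza"
	confidence, food := postProcess([]uint8{3, 240, 10}, labelmap)
	fmt.Printf("It %s a %s in the picture\n", confidence, food)
}
```

Running this prints "It is very likely a hot dog in the picture" for the toy input above; the Rust function returns the same kind of sentence, wrapped in an HTML link.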

You can use the rustwasmc tool to compile this function into WebAssembly bytecode.

Here, we require Rust compiler version 1.50 or earlier so that WebAssembly functions can be used with WasmEdge's Golang API. Once the interface types specification is finalized and supported, we will catch up with the latest Rust compiler version.
$ rustup default 1.50.0

$ cd flow/rust_mobilenet_food
$ rustwasmc build --enable-ext
# The output WASM will be pkg/rust_mobilenet_food_lib_bg.wasm.

# Copy the wasm bytecode file to the flow/ directory
$ cp pkg/rust_mobilenet_food_lib_bg.wasm ../

Integration with YoMo

On the YoMo side, we use the WasmEdge Golang API to start and run the WasmEdge virtual machine for the image recognition function. The app.go file in the source code project looks like this:

package main

... ...

var (
    vm      *wasmedge.VM
    vmConf  *wasmedge.Configure
    counter uint64
)

func main() {
    // Initialize WasmEdge's VM
    initVM()
    defer vm.Delete()
    defer vmConf.Delete()

    // Connect to Zipper service
    cli, err := client.NewServerless("image-recognition").Connect("localhost", 9000)
    if err != nil {
        log.Print("❌ Connect to zipper failure: ", err)
        return
    }

    defer cli.Close()
    cli.Pipe(Handler)
}

// Handler processes the data in the stream
func Handler(rxStream rx.RxStream) rx.RxStream {
    stream := rxStream.
        Subscribe(ImageDataKey).
        OnObserve(decode).
        Encode(0x11)
        
    return stream
}

// decode decodes the incoming data and performs image recognition
var decode = func(v []byte) (interface{}, error) {
    // get image binary
    p, _, _, err := y3.DecodePrimitivePacket(v)
    if err != nil {
        return nil, err
    }
    img := p.ToBytes()

    // recognize the image by invoking the wasm function
    res, err := vm.ExecuteBindgen("infer", wasmedge.Bindgen_return_array, img)
    if err != nil {
        return nil, err
    }
    fmt.Println("GO: Run bindgen -- infer:", string(res.([]byte)))

    return res, nil
}

... ...

// initVM initializes WasmEdge's VM
func initVM() {
    wasmedge.SetLogErrorLevel()
    vmConf = wasmedge.NewConfigure(wasmedge.WASI)
    vm = wasmedge.NewVMWithConfig(vmConf)

    var wasi = vm.GetImportObject(wasmedge.WASI)
    wasi.InitWasi(
        os.Args[1:],     // The args
        os.Environ(),    // The envs
        []string{".:."}, // The mapping directories
        []string{},      // The preopens will be empty
    )

    // Register WasmEdge-tensorflow and WasmEdge-image
    var tfobj = wasmedge.NewTensorflowImportObject()
    var tfliteobj = wasmedge.NewTensorflowLiteImportObject()
    vm.RegisterImport(tfobj)
    vm.RegisterImport(tfliteobj)
    var imgobj = wasmedge.NewImageImportObject()
    vm.RegisterImport(imgobj)

    // Instantiate wasm
    vm.LoadWasmFile("rust_mobilenet_food_lib_bg.wasm")
    vm.Validate()
    vm.Instantiate()
}

Run

Finally, let's start YoMo and watch the entire data processing pipeline in action. Start the YoMo CLI application from the project folder. The workflow.yaml file defines the port YoMo should listen on and the workflow handler that is triggered for incoming data. Note that the flow name image-recognition matches the data handler app.go mentioned above.

$ yomo serve -c ./zipper/workflow.yaml
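For orientation, a workflow file for this demo might look roughly like the sketch below. Treat the field names as assumptions based on YoMo CLI v0.0.5 conventions, and consult zipper/workflow.yaml in the cloned repo for the authoritative version:

```yaml
# Hypothetical sketch of zipper/workflow.yaml; the schema depends on the CLI version.
name: Service
host: localhost
port: 9000                    # the zipper port that app.go connects to
flows:
  - name: image-recognition   # must match client.NewServerless("image-recognition") in app.go
```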

Start the handler program by running app.go:

$ cd flow
$ go run --tags "tensorflow image" app.go

Start the simulated data source by sending a video to YoMo. A video is a series of picture frames. The WasmEdge function in app.go will be called for each picture frame of the video.

# Download a video file
$ wget -P source 'https://github.com/yomorun/yomo-wasmedge-tensorflow/releases/download/v0.1.0/hot-dog.mp4'

# Stream the video to YoMo
$ go run ./source/main.go ./source/hot-dog.mp4

You can see the output of the WasmEdge handler function in the console. It will print the names of objects detected in each picture frame of the video.

Looking ahead

This article discusses how to use the WasmEdge Tensorflow API and Golang SDK in the YoMo framework to process image streams in near real-time.

Together with YoMo, we will soon deploy WasmEdge in production in smart factories for various assembly-line tasks. WasmEdge is a software runtime for edge computing!
