
Alpaca Core Local SDK

Note

This project is still in an alpha stage of development. Significant changes are very likely and backwards compatibility is disregarded.

The Alpaca Core Local SDK, or AC Local for short, is a multi-platform SDK for local AI inference.

"Local" here means running on the device which executes the code. This could be a server, a desktop, or a mobile device.

It provides a unified API for doing inference with multiple models. The API itself can be split into two layers:

  • Programming language specific (Language API): The API one calls when writing code in a specific programming language. It is just a means to invoke the:
  • Inference API: A JSON/CBOR/POJO-like API used to communicate with the underlying inference engines, following their specific API schemas. A short sketch of how the two relate follows this list.
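As a rough sketch of how the two layers relate, consider the call below. It mirrors the Minimal Example further down; the exact payload schema depends on the inference plugin in use:

// Inference API layer: the request is plain JSON/CBOR-like data whose keys
// ("prompt" here) follow the schema of the specific inference engine.
// Language API layer: in C++ the same request is written as a dict literal
// and handed to a model instance:
auto result = instance->runOp("run", {{"prompt", "If you could travel faster than light,"}});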

Read the full introduction here.

Supported models

The SDK on its own does not support any models. It contains the tools for building and loading plugins which provide inference for specific models.

Some libraries which have AC Local Plugins include:

This list will be updated as new models are added.

Bindings, Wrappers, and Integrations

This repo contains the Inference SDK implementation and the Inference API documentation. The Inference SDK is implemented in C++, so the C++ Language API and its documentation are also hosted here. Additionally, there are bindings, wrappers, and integrations for other languages and platforms. Their documentation is hosted in, and accessible from, their respective repositories:

Minimal Example

// Load all available AC Local plugins; these provide the actual inference
// implementations (the SDK core supports no models on its own)
ac::local::Lib::loadAllPlugins();

// Describe the model: which plugin-provided type handles it and which assets
// (here a gguf file on disk) it needs, plus default load parameters
auto model = ac::local::Lib::createModel(
    {
        .type = "llama.cpp gguf",
        .assets = {
            {.path = "/path/to/model.gguf"}
        }
    },
    { /*default params*/ }
);

// Instantiate the model for inference
auto instance = model->createInstance("general", { /*default params*/ });

// Run an op with a JSON-like payload following the plugin's schema
auto result = instance->runOp("run", {{"prompt", "If you could travel faster than light,"}});

// The result is a JSON-like value as well
std::cout << result << "\n";

Demos

Most inference libraries with AC Local plugins have simple examples in their respective repositories. Additionally, we have some standalone demos:

Usage

Check out the guide on getting started.

Contributing

Check out the contributing guide.

License

This software is distributed under the MIT Software License. See accompanying file LICENSE or copy here.

Copyright © 2024 Alpaca Core, Inc

Third Party Libraries

A list of the third party libraries used can be found here. Please consider supporting them.

Additionally, if you deploy this software as binary, please include etc/ac-local-deploy-licenses.md in your deployment.