Introduction to the Beta9 Project
The Beta9 Project, part of the Beam cloud platform, is designed to run serverless workloads efficiently on bare-metal servers located anywhere in the world. This allows it to offer fast cold starts and high-performance computing in a flexible, cross-cloud environment.
Features
- Serverless Execution with Python: Beta9 provides a friendly Python interface for developers to run serverless workloads. Using decorators like @endpoint, @function, @task_queue, and @schedule, developers can easily create scalable HTTP endpoints, long-running functions, task queues, and scheduled jobs (see the sketch after this list).
- Autoscaling and Scale-to-Zero: The platform automatically scales workloads based on demand and seamlessly scales down to zero when not in use, optimizing resource utilization and cost efficiency.
- Distributed Storage for Large Files: Beta9 supports reading large datasets efficiently by utilizing distributed, cross-region storage, enabling data processing at the edge.
- Easy Cluster Integration: With just a single cURL command, users can connect their bare-metal nodes to the cluster, simplifying the integration process.
- Advanced Management Using Tailscale: Manage your fleet of servers with a service mesh powered by Tailscale, facilitating secure and efficient network management.
- End-to-End Encryption with WireGuard: All workloads are secured through end-to-end encryption, ensuring data integrity and confidentiality across the cloud infrastructure.
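As an example, here is a minimal sketch of the @function decorator. It assumes the decorator accepts the same resource arguments as the @endpoint example shown later, and that decorated functions expose a .map() helper that fans work out across remote containers; treat the exact names and arguments as illustrative rather than definitive.

from beta9 import function

# Each invocation is intended to run in its own remote container.
# The resource arguments are assumed to mirror the @endpoint example below.
@function(cpu=1, memory=128)
def square(i: int):
    return i**2

def main():
    # .map() is assumed to fan the inputs out across remote containers
    # and yield results as they complete.
    squared = list(square.map(range(10)))
    print(squared)

if __name__ == "__main__":
    main()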
How Does It Work?
Beta9 enables the deployment of serverless functions written in Python. By adding an @endpoint decorator, developers can transform their functions into secure and scalable HTTP services. The deployment process is simplified to a single command, making it accessible to developers of all experience levels.
For instance, the following code snippet showcases how to use Beta9's capabilities to deploy a mathematical operation:
from beta9 import endpoint

# Each request runs remotely with the requested CPU, memory, and an A100-40 GPU
@endpoint(cpu=1, memory=128, gpu="A100-40")
def square(i: int):
    return i**2
Deploying this function involves a straightforward command:
$ beta9 deploy app.py:square --name inference
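Once deployed, the endpoint is invoked over plain HTTP. The call below is only a sketch: the gateway host, URL path, and authorization header are placeholders that depend on your deployment, and the JSON body assumes request fields are mapped onto the function's keyword arguments (here, i).

$ curl -X POST 'https://<your-gateway-host>/<inference-endpoint-url>' \
    -H 'Authorization: Bearer <YOUR_TOKEN>' \
    -H 'Content-Type: application/json' \
    -d '{"i": 4}'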
Running on Bare-Metal Globally
Beta9 can connect any GPU to a cluster with minimal setup, allowing organizations to leverage powerful computing resources worldwide. A single command generates an install script that connects a virtual machine (VM) to your cluster:
$ beta9 machine create --pool lambda-a100-40
This feature allows for flexible scaling and efficient use of computational resources.
Managing Resources
Users can view and manage their entire distributed cluster using a centralized control plane, ensuring they have complete oversight over resource allocation and performance:
$ beta9 machine list
Local Installation
Beta9 can be set up locally or within an existing Kubernetes cluster using a Helm chart for easy deployment and management. The setup process for the server and SDK is streamlined through dedicated make targets:
- Server Setup: make setup
- SDK Setup: make setup-sdk
These commands get both the server and the SDK up and running quickly.
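For reference, a first-time local setup might look like the following. The clone URL is an assumption based on the project's GitHub presence; adjust it to wherever your copy of the repository lives.

$ git clone https://github.com/beam-cloud/beta9.git
$ cd beta9
$ make setup       # server setup
$ make setup-sdk   # SDK setup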
Contribution and Community Support
Beta9 welcomes community contributions and provides several avenues for support and engagement, including a dedicated Slack channel for real-time chat, GitHub for issue tracking, and Twitter for the latest updates.
Developers are encouraged to participate by submitting feature requests or bug reports and opening pull requests for new features or improvements.
Conclusion
The Beta9 Project offers a highly efficient, cross-cloud serverless engine that integrates seamlessly with modern bare-metal environments. With robust features and strong community support, Beta9 is built to meet the demanding needs of developers and enterprises running serverless workloads across clouds and bare metal.