Checklists and LLM prompts for efficient and effective test creation in data analysis
Updated Jun 12, 2024 · Jupyter Notebook
Weaviate is an open-source vector database that stores both objects and vectors, allowing vector search to be combined with structured filtering, with the fault tolerance and scalability of a cloud-native database.
This is the repository for my studies in the MLOps Zoomcamp from DataTalksClub.
A high-throughput and memory-efficient inference and serving engine for LLMs
Scalable and flexible workflow orchestration platform that seamlessly unifies data, ML and analytics stacks.
✨ Writing about machine learning and deep learning
Portfolio of Gilles de PERETTI, Machine Learning Engineer
Workflow Engine for Kubernetes
Substrate Python SDK
🔥 A tool for visualizing and tracking your machine learning experiments. This repo contains the CLI and Python API.
Qdrant - High-performance, massive-scale Vector Database for the next generation of AI. Also available in the cloud https://cloud.qdrant.io/
📜 Interested in MLOps? Here's your MLOps encyclopedia for dummies
cube studio is an open-source, cloud-native, one-stop machine learning/deep learning/LLM AI platform. It supports SSO login, multi-tenancy, big-data platform integration, online notebook development, drag-and-drop pipeline orchestration, multi-node multi-GPU distributed training, hyperparameter search, vGPU inference serving, edge computing, serverless, an annotation platform with automated labeling, dataset management, LLM fine-tuning, vLLM-based LLM inference, LLMOps, private knowledge bases, and an AI model app store with one-click model development/inference/fine-tuning. It also supports domestic Chinese CPU/GPU/NPU chips, RDMA, and pytorch/tf/mxnet/deepspeed/paddle/colossalai/horovod/spark/ray/volcano distributed frameworks.
Label Studio is a multi-type data labeling and annotation tool with standardized output format
Modern columnar data format for ML and LLMs implemented in Rust. Convert from Parquet in 2 lines of code for 100x faster random access, vector indexing, and data versioning. Compatible with Pandas, DuckDB, Polars, and PyArrow, with more integrations coming.
AI Observability & Evaluation
Apache Airflow - A platform to programmatically author, schedule, and monitor workflows
Friendli: the fastest serving engine for generative AI
🦋 A personal research and development (R&D) lab that facilitates the sharing of knowledge.
The easiest way to serve AI/ML models in production - Build Model Inference Service, LLM APIs, Multi-model Inference Graph/Pipelines, LLM/RAG apps, and more!