GOSIM 2024 Conference kicks off on May 6, 2024

GOSIM Schedule
May 6th

Schedule

AI & Agents

World leading researchers sharing insights on AI and Agents

9:00

Registration & Coffee

9:40

Opening Keynote

Opening Keynote and Introduction

10:00

Fostering Responsible AI: Empowering Openness and Community Collaboration

In the realm of Generative Artificial Intelligence (GenAI), the pursuit of responsible innovation hinges upon a steadfast dedication to openness and community collaboration. This talk illuminates the indispensable role of these principles in guiding GenAI development towards ethical and accountable outcomes. We delve into the transformative influence of openness and community collaboration on the very fabric of GenAI systems. Through the exchange of knowledge, diverse viewpoints, and collective wisdom, we navigate the landscape of building AI systems with transparency, thereby laying a solid groundwork for responsible AI practices. Central to our discourse is the introduction of the Model Openness Framework (MOF) crafted by Generative AI Commons within LF AI and Data. The MOF serves as a ranked classification system designed to evaluate all machine learning (ML) models, providing a structured approach to promote transparency and accountability in GenAI development. Join us on a journey to unlock the full potential of Generative AI through the empowerment of open collaboration and transparency. Together, we accelerate the process of building AI systems with openness and community involvement, fostering responsible innovation for the benefit of all.
Anni Lai

10:40

Challenges and Opportunities in Using Rust for AI

AI has played an increasingly significant role in our attempts to migrate code to safe Rust, by classifying unsafe code and translating simple programming projects. In addition to traditional Python-based solutions, we see benefits in using Rust for these AI-based Rust programming tasks through open-source projects such as `RustBERT` and `Candle`. Our AI4Rust and Rust4AI experiences call for accelerating the flywheel of innovation between Rust and AI.
Yijun Yu
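As a toy illustration of the "classifying unsafe code" step mentioned above, a first pass might simply locate `unsafe` tokens in source text. This is a hand-rolled sketch, far simpler than the ML-based approaches the talk covers, and not code from RustBERT or Candle:

```rust
// Toy scanner that counts occurrences of the `unsafe` keyword in
// Rust source text. An illustrative stand-in for ML-based
// classification of unsafe code, not the talk's actual tooling.
fn count_unsafe_tokens(source: &str) -> usize {
    source
        .split_whitespace()
        .filter(|token| *token == "unsafe")
        .count()
}

fn main() {
    let src = r#"
        fn safe_fn() {}
        unsafe fn raw_read(p: *const u8) -> u8 { *p }
        fn caller(p: *const u8) -> u8 { unsafe { raw_read(p) } }
    "#;
    // Two occurrences: the unsafe fn and the unsafe block.
    println!("unsafe occurrences: {}", count_unsafe_tokens(src));
}
```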

11:20

Morning Break

11:40

Unified Acceleration Framework of Both LLM and Generative AI Models on The Edge

Dr. Wang will present a unified acceleration framework for running both LLMs and generative AI models on edge devices.
Yanzhi Wang

12:20

Write Once Run Anywhere, But for GPUs

With the growing popularity of AI and Large Language Model (LLM) applications, there is an increasing demand for running and scaling these workloads in the cloud and on edge devices. However, the reliance on GPUs and hardware accelerators poses challenges for traditional container-based deployments. Rust and WebAssembly (Wasm) offer a solution by providing a portable bytecode format that abstracts hardware complexities. LlamaEdge is a lightweight, high-performance, cross-platform LLM inference runtime. Written in Rust and built on the WasmEdge runtime, LlamaEdge provides a standard API, known as WASI-NN, to LLM app developers. Developers only need to write against the API and compile their programs to Wasm bytecode. The Wasm bytecode file can run on any device, where WasmEdge translates and routes Wasm calls to underlying native libraries such as llama.cpp, TensorRT-LLM, MLX, PyTorch, TensorFlow, candle, and burn. The result is very small and portable Wasm apps that run LLM inference at full native speed across many different devices. In this talk, we will discuss the design and implementation of LlamaEdge and demonstrate how it enables cross-platform LLM app development and deployment. We will also walk through several code examples, from a basic sentence completion app, to a chatbot, to a RAG agent app with external knowledge in vector databases, to a Kubernetes-managed app across a heterogeneous cluster of different GPUs and NPUs. Attendees will learn how to run open source LLMs on their own devices and, more importantly, how to create and deploy their customized LLM apps using LlamaEdge APIs.
Hydai Tai
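The core "write once, run anywhere" idea in the abstract above — app code targets one API while the runtime routes calls to whichever native backend the host provides — can be sketched in plain Rust. The names below (`Backend`, `InferenceSession`) are invented for illustration; this is not the WASI-NN API:

```rust
// Illustrative sketch of "one API, many backends": the application
// never changes, only the backend the runtime selects for the host.
// Types here are made up, not LlamaEdge's or WASI-NN's actual API.
#[derive(Debug, Clone, Copy, PartialEq)]
enum Backend {
    LlamaCpp, // portable CPU fallback
    TensorRt, // NVIDIA GPUs
    Mlx,      // Apple Silicon
}

struct InferenceSession {
    backend: Backend,
}

impl InferenceSession {
    // Pick a backend based on the host; the calling code is identical
    // on every device, mirroring how Wasm bytecode stays unchanged.
    fn new_for_host(host: &str) -> Self {
        let backend = match host {
            "apple-silicon" => Backend::Mlx,
            "nvidia-gpu" => Backend::TensorRt,
            _ => Backend::LlamaCpp,
        };
        InferenceSession { backend }
    }

    fn infer(&self, prompt: &str) -> String {
        // A real runtime would route this to a native library.
        format!("[{:?}] completion for: {}", self.backend, prompt)
    }
}

fn main() {
    let session = InferenceSession::new_for_host("apple-silicon");
    println!("{}", session.infer("Hello"));
}
```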

13:00

Lunch

Lunch

14:30

dora-rs: LLM Powered Runtime Code Change in Robots

In this talk, we show how allowing LLMs to modify a robot's codebase at runtime enables new human-machine interactions. To achieve this, we use dora-rs, a robotic framework capable of changing code at runtime while keeping state, also known as hot-reloading. By pairing dora-rs with LLMs, we demonstrate that robots can be controlled and instructed in natural language to modify any aspect of the robot's codebase. This approach enables human-robot interactions that were previously inaccessible due to the limitations of predefined interfaces, paving the way to more sophisticated and wider use of robotic applications that can better understand and respond to human needs.
Xavier Tao
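The essence of hot-reloading as described above — swap the behavior, keep the state — can be sketched in a few lines of std-only Rust. This is a conceptual illustration with invented names, not the dora-rs API:

```rust
// Minimal sketch of hot-reloading with preserved state: the handler
// closure can be replaced at runtime while the node's state lives on.
// Names (Node, tick, reload) are illustrative, not dora-rs's API.
struct Node {
    state: i64, // survives handler swaps
    handler: Box<dyn Fn(i64, i64) -> i64>,
}

impl Node {
    fn new() -> Self {
        // Initial behavior: accumulate inputs into state.
        Node { state: 0, handler: Box::new(|state, input| state + input) }
    }

    fn tick(&mut self, input: i64) {
        self.state = (self.handler)(self.state, input);
    }

    // "Hot-reload": replace the behavior, keep the state.
    fn reload(&mut self, new_handler: Box<dyn Fn(i64, i64) -> i64>) {
        self.handler = new_handler;
    }
}

fn main() {
    let mut node = Node::new();
    node.tick(5); // state = 5 under the original handler
    // An LLM-generated handler could be swapped in here.
    node.reload(Box::new(|state, input| state * input));
    node.tick(3); // state = 15: new code, old state
    println!("state = {}", node.state);
}
```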

15:10

Moxin: A Pure Rust Explorer for Open Source LLMs

Moxin is an open source tool that makes it easy for users to explore and experiment with open source LLMs on their own computers. It is assembled from a collection of loosely coupled building blocks of Rust and Wasm components:

* A Rust-native chatbot UI
* A cross-platform LLM inference engine based on LlamaEdge
* A Rust-native UI for managing, filtering, and displaying open source LLM models
* A database and web service for a catalog of open source LLMs; the database provides an admin dashboard for the community to update those models
* An in-process Rust messaging channel for the frontend widgets to communicate with the LlamaEdge model service and the model metadata service

In this talk, we will showcase the Moxin app, discuss its architecture, and demonstrate how it serves as a template and component library for Rust developers to create their own cross-platform LLM applications. We will also discuss the roadmap for this project, including timelines for the model database and plans for advanced features such as multimodal models and RAG apps. Attendees will learn how to build rich UI applications and LLM services using Rust and Wasm, and discover how Moxin is revolutionizing the way users interact with LLMs on their personal computers.
Jorge Bejar
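The in-process messaging channel in the architecture above — frontend widgets talking to a model service over channels rather than direct calls — can be sketched with `std::sync::mpsc`. The message types are made up for illustration; this is not Moxin's actual code:

```rust
// Sketch of an in-process channel between a UI and a model service,
// in the spirit of the Moxin architecture described above. The
// Request type and respond() are invented for this example.
use std::sync::mpsc;
use std::thread;

enum Request {
    Complete(String),
    Shutdown,
}

// Stand-in for real LLM inference.
fn respond(prompt: &str) -> String {
    format!("echo: {}", prompt)
}

fn main() {
    // One channel per direction: UI -> service, service -> UI.
    let (ui_tx, svc_rx) = mpsc::channel::<Request>();
    let (svc_tx, ui_rx) = mpsc::channel::<String>();

    // The "model service" runs on its own thread.
    let service = thread::spawn(move || {
        while let Ok(req) = svc_rx.recv() {
            match req {
                Request::Complete(prompt) => {
                    svc_tx.send(respond(&prompt)).unwrap();
                }
                Request::Shutdown => break,
            }
        }
    });

    // The "UI" sends a prompt and waits for the reply.
    ui_tx.send(Request::Complete("hello".into())).unwrap();
    let reply = ui_rx.recv().unwrap();
    println!("{}", reply); // prints "echo: hello"

    ui_tx.send(Request::Shutdown).unwrap();
    service.join().unwrap();
}
```

Decoupling the UI from the model service this way keeps the frontend responsive while inference runs on another thread.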

16:00

Afternoon Break

16:20

Dioxus: AI driven UI

Jonathan Kelley

17:00

Social Hours

Schedule

App & Web

High performance cross-platform app & web development

9:00

Registration & Coffee

9:40

Opening Keynote

Opening Keynote and Introduction

10:00

Rust Application Development White Paper

Discuss the state of the Rust application development ecosystem: where we are now and where we are headed.
Nico Burns

10:40

OpenHarmony for Next Gen Mobile

To be announced.
Jonathan Schwender

11:20

Morning Break

11:40

Full Stack Rust With Leptos

Rust has proven to be a strong choice for backend web services, but new and upcoming frameworks like Leptos have made it a strong choice for building interactive frontend web UIs as well. Come learn why you might want to build a full stack Rust web app with Leptos, leveraging the power of Rust to deliver web apps rivaling any other web stack.
Ben Wishovich

12:20

Quake: Bridging the Build System Gap

The complexity of software has grown significantly over the past decades, outstripping the build systems that underlie it. Modern software requires build-time support for asset handling, cross-platform compilation, and more, but this gap is most often filled by hacked-together solutions that only add further technical debt. quake addresses this issue head-on, providing a simple but expressive interface over the Nushell language that allows any developer to construct robust, cross-platform build scripts without any magic incantations. Beyond quake itself, we'll dive deeper into other widely used build systems and the unique challenges they face, in order to better understand how we got where we are and what the future could look like.
Cassaundra Smith

13:00

Lunch

Lunch

14:30

Wrapping Cargo for Shipping

Follow along as we explore the road a Rust application takes from source into application stores and onto customer devices. We will discuss how to post-process Cargo artifacts to integrate them into Windows, macOS, mobile, and more. We will then attempt to please Apple's App Store validation, Microsoft's audits, and the scrutiny of Linux distributors, so our Rust application reaches the distribution channels our users expect and deserve. Finally, we will look at the Osiris project, which is the home of the tooling we use and strives to document this process.
David Rheinsberg

15:10

Modular Servo: Three Paths Forward

From the start, about ten years ago, Servo was meant to be a modular web engine. What does this mean, where do we stand today, and where are we going? We will discuss three distinct paths: the embedding layer, independent projects using parts of Servo, and Servo integrating independent projects.
Gregory Terzian

16:00

Afternoon Break

17:00

Social Hours

9:00

Registration & Coffee

9:40

Opening Keynote

Opening Keynote and Introduction

10:40

Entering the World of Fediverse: Messaging with Matrix

Matrix 2.0 is the next big evolution of Matrix - a set of new APIs which provide instant login, instant sync and instant launch; native support for OpenID Connect; native E2EE group VoIP and general performance improvements to ensure that Matrix-based communication apps can outperform the mainstream proprietary centralized alternatives. In this talk, we’ll explain how Matrix 2.0 is progressing, and how matrix-rust-sdk has become the flagship Matrix client SDK from the core team, using the safety and performance of Rust to provide a gold standard SDK implementation for the benefit of all.
Matthew Hodgson

11:20

Morning Break

11:40

Palpus - An Open-Source Rust Matrix Server

The Privoce team, leveraging four years of experience in developing decentralized social platforms, is excited to announce the upcoming launch of a new project: a chat server written in Rust. This project is designed to support the Matrix protocol, with plans to extend compatibility to other social feed protocols and to integrate self-hosted AI agents in the future. Drawing on our technical foundation, this Rust-based server aims to be efficient, lightweight, and reliable. It will incorporate the strengths of the Rust programming language and is engineered to enhance performance and security. This initiative reflects our commitment to pushing the boundaries of what's possible in decentralized communication technologies, focusing on user privacy, data security, and seamless interaction across different platforms.
Tom Zhu

12:20

Robrix: a Multi-Platform Matrix & Fediverse Hub

Robrix is (currently) a new Matrix chat client application written in Rust to demonstrate and drive the feature set of Project Robius, a multi-platform app development framework. Thanks to the efforts of the Robius software stack, and in particular the Makepad UI toolkit, Robrix runs seamlessly across Android, iOS, macOS, Linux, and Windows (with web and OpenHarmony to come), all without a single line of platform-specific code. This talk will cover the general architecture and features of Robrix, our experience developing apps in Rust and the challenges encountered therein, and how Robrix's needs have driven the development of ecosystem components. Finally, we'll lay out our future vision for Robrix as an open-source "hub" app, bringing together many aspects of the fediverse beyond Matrix chat: decentralized social networks, news aggregators and forums, code views for git hosts, and the integration of AI features via local LLMs.
Kevin Boos

13:00

Lunch

Lunch

14:30

Mega - Decentralized Open Source Collaboration for Source Code & LLM

Mega is a groundbreaking monorepo and monolithic codebase management system, particularly for managing source code and large language models (LLMs). Mega's decentralized network of services harnesses the Git and Git LFS protocols, fostering an inclusive development ecosystem and bolstering data integrity. Its integrative capabilities encompass advanced messaging protocols such as Matrix and Nostr, facilitating decentralized communication and collaboration. Mega revolutionizes open-source cooperation by offering a flexible, secure, and inclusive platform, empowering developers globally.
Quanyi Ma

15:10

Do You Know Who Wrote Your Software?

In open source projects, it is quite common for software to consist of hundreds and hundreds of dependencies. This is inherently a sign of proper collaboration in a healthy ecosystem. But in today's world, it also poses a risk when we seek assurances about the pedigree of our software. As the thwarted attack on XZ Utils clearly shows, we need to pay more attention to our supply chain. In this presentation I want to raise awareness of this challenge from the perspective of a Rust developer, using the notions of the "burden problem" and the "trust problem", and touching on various issues such as software distribution, version management, open source licensing, and build-time security.
Marc Schoolderman

16:00

Afternoon Break

16:20

Open Source Payment Orchestrator

Subject to change. To be announced.

17:00

Social Hours

Secure your seat at the frontier of Tech.