Streaming with Funcchain

Example

See stream.py

This example shows how to stream the output of a text generation task with funcchain.

Full Code Example


from funcchain import chain, settings
from funcchain.backend.streaming import stream_to

settings.temperature = 1

def generate_story_of(topic: str) -> str:
    """
    Write a short story based on the topic.
    """
    return chain()

with stream_to(print):
    generate_story_of("a space cat")

Demo

with stream_to(print):
    generate_story_of("a space cat")

$ Once upon a time in a galaxy far, far away, there was a space cat named Whiskertron...

Instructions

Step-by-step

Necessary Imports

from funcchain import chain, settings
from funcchain.backend.streaming import stream_to

Configure Settings

The sampling temperature is set here; it controls how random, and therefore how creative, the language model's output is. Experiment with different values.

settings.temperature = 1
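
As a rough guide (the specific values below are only illustrative, not recommendations from the funcchain docs), lower temperatures produce more predictable prose and higher ones produce more varied output:

settings.temperature = 0.2  # more deterministic, conservative phrasing
settings.temperature = 1.2  # more varied, adventurous phrasing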

Define the Story Generation Function

The generate_story_of function takes a topic and returns chain(), which prompts the language model using the function's docstring, its arguments, and its return type annotation to produce the story.

def generate_story_of(topic: str) -> str:
    """
    Write a short story based on the topic.
    """
    return chain()
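
Because chain() builds its prompt from the function signature and docstring, extra parameters become additional context for the model. As a minimal sketch (the genre parameter and the exact way arguments are surfaced in the prompt are assumptions for illustration, not part of the original example):

def generate_story_of(topic: str, genre: str) -> str:
    """
    Write a short story based on the topic, in the given genre.
    """
    # Assumption: 'genre' is passed to the model as extra context alongside 'topic'.
    return chain()

generate_story_of("a space cat", "noir")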

Execute the Streaming Generation

This block uses the stream_to context manager to print each chunk of the story as it is generated, instead of waiting for the full response.

with stream_to(print):
    generate_story_of("a space cat")