
Accept the configs parameter when creating a Chat. #360

@hartefisher

Description


Is your feature request related to a problem? Please describe.
I cannot customize model configs such as temperature or top_p when creating a Chat, which prevents me from tuning the output of the Chat's generate_message function.

Describe the solution you'd like
The inference method of the Session in app.services.assistant.generation.session should not hardcode configs as {}; instead, it should accept the configs set on the Chat. Correspondingly, the Chat class should gain a configs field, and the related interfaces of the Python client should be updated accordingly.
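A minimal sketch of the proposed change, using stand-in classes rather than the actual TaskingAI internals (the Chat/Session shapes and field names here are assumptions for illustration): the Chat carries a configs dict, and the session's inference reads it instead of hardcoding {}.

```python
from dataclasses import dataclass, field
from typing import Any, Dict


@dataclass
class Chat:
    """Stand-in for the Chat model: proposed configs field added."""
    chat_id: str
    configs: Dict[str, Any] = field(default_factory=dict)


class Session:
    """Stand-in for app.services.assistant.generation.session.Session."""

    def __init__(self, chat: Chat):
        self.chat = chat

    def inference(self) -> Dict[str, Any]:
        # Proposed behavior: use the configs set on the Chat
        # rather than a hardcoded empty dict.
        configs = self.chat.configs or {}
        # ...configs would be forwarded to the model provider call here...
        return {"configs": configs}


chat = Chat(chat_id="chat_123", configs={"temperature": 0.2, "top_p": 0.9})
session = Session(chat)
print(session.inference())  # {'configs': {'temperature': 0.2, 'top_p': 0.9}}
```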

Describe alternatives you've considered
Currently, I can only set configs via taskingai.inference.chat_completion. In the meantime, to preserve chat functionality, I have to manually manage the message history myself.
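The workaround above amounts to a loop like the following sketch. The complete callable is a placeholder for the chat_completion call (its exact signature in the real client is not shown here, so this is an assumption); the point is that the caller must append both sides of every turn to keep context.

```python
from typing import Any, Callable, Dict, List

Message = Dict[str, str]


def chat_turn(
    history: List[Message],
    user_text: str,
    complete: Callable[..., str],  # placeholder for chat_completion
    configs: Dict[str, Any],
) -> str:
    # Append the user message, call the completion API with configs,
    # then append the assistant reply so the next turn sees full context.
    history.append({"role": "user", "content": user_text})
    reply = complete(messages=history, configs=configs)
    history.append({"role": "assistant", "content": reply})
    return reply


# Stand-in for the real completion call, for demonstration only.
def fake_complete(messages: List[Message], configs: Dict[str, Any]) -> str:
    return f"echo:{messages[-1]['content']}"


history: List[Message] = []
chat_turn(history, "hello", fake_complete, {"temperature": 0.2})
chat_turn(history, "again", fake_complete, {"temperature": 0.2})
print(len(history))  # 4: two user/assistant pairs
```

Having configs on the Chat itself would make this bookkeeping unnecessary.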

