Hey there,
I’m currently experimenting with developing nodes with LLM functionality.
Is there a way to “tap into” what is already implemented in the existing Python-based Gen AI extension?
For example, I want to make my node “compatible” with the ports of the OpenAI Chat Model Connector (I am working on a Prompter node), and I observed the following:
When I include the relevant port definitions directly in my extension, I get this error:
ValueError: There is already a port type with the provided object class '<class 'utils.ports.ChatModelPortObject'>' registered.
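For context, this is roughly what I copied into my extension (a simplified sketch; the spec/object classes below are placeholder stand-ins for the real ones in the Gen AI extension's `utils.ports`, and the serialization is dummied out):

```python
import knime.extension as knext


# Placeholder spec -- the real one carries the chat model configuration.
class ChatModelPortObjectSpec(knext.PortObjectSpec):
    def serialize(self) -> dict:
        return {}

    @classmethod
    def deserialize(cls, data: dict) -> "ChatModelPortObjectSpec":
        return cls()


# Placeholder port object mirroring utils.ports.ChatModelPortObject.
class ChatModelPortObject(knext.PortObject):
    def __init__(self, spec: ChatModelPortObjectSpec):
        super().__init__(spec)

    def serialize(self) -> bytes:
        return b""

    @classmethod
    def deserialize(cls, spec: ChatModelPortObjectSpec, data: bytes) -> "ChatModelPortObject":
        return cls(spec)


# Registering the port type in my extension is what seems to collide with
# the registration already done by the installed Gen AI extension:
chat_model_port_type = knext.port_type(
    name="Chat Model",
    object_class=ChatModelPortObject,
    spec_class=ChatModelPortObjectSpec,
)
```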
Is there a way to “point” my extension to the existing extension's Custom Port Objects? How do I import Custom Port Objects correctly?
I’ve followed this part of the documentation and have added “org.knime.python.features.llm” as a dependency in my knime.yml (see the excerpt after the quoted docs):
> Feature dependencies: if your extension depends on another extension, you can specify it as a bullet point of `feature_dependencies`. Optionally, you can add a specific minimum version to it. Example: You use data types like `SmilesValue` of the KNIME Base Chemistry Types & Nodes extension in your extension. You have that extension already installed and want to make sure that everybody who uses your extension will also have this extension installed. Then you can go to Help > About KNIME Analytics Platform > Installation Details and check the id of KNIME Base Chemistry Types & Nodes, which is `org.knime.features.chem.types.feature.group`. Take the id without `.feature.group` and you have the string of the feature dependency: `org.knime.features.chem.types`
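For reference, the relevant part of my knime.yml currently looks like this (trimmed down to the dependency entry; I'm not sure this is the correct feature id to use):

```yaml
# knime.yml (excerpt) -- only the dependency-related part shown.
# "org.knime.python.features.llm" is the feature id with ".feature.group"
# stripped, as described in the quoted docs; a minimum version could
# optionally be appended here as well.
feature_dependencies:
  - org.knime.python.features.llm
```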