🌿 Fern Regeneration -- February 21, 2024 #12

Merged: 1 commit, Feb 21, 2024
2 changes: 1 addition & 1 deletion assemblyai.gemspec
@@ -4,7 +4,7 @@ require_relative "lib/gemconfig"
 
 Gem::Specification.new do |spec|
 spec.name = "assemblyai"
-spec.version = "1.0.0-beta"
+spec.version = "1.0.0-beta.2"
 spec.authors = AssemblyAI::Gemconfig::AUTHORS
 spec.email = AssemblyAI::Gemconfig::EMAIL
 spec.summary = AssemblyAI::Gemconfig::SUMMARY
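The only change in this file is the prerelease bump from 1.0.0-beta to 1.0.0-beta.2. A quick sketch of how RubyGems orders these two version strings (pure stdlib, nothing from this repo):

```ruby
require "rubygems" # Gem::Version; loaded by default on modern Rubies

old_version = Gem::Version.new("1.0.0-beta")
new_version = Gem::Version.new("1.0.0-beta.2")

# Both are prereleases, and the trailing ".2" segment sorts after the bare
# "-beta", so dependency resolution picks up the new gem.
puts old_version.prerelease?   # => true
puts new_version > old_version # => true
```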
32 changes: 16 additions & 16 deletions lib/assemblyai/lemur/client.rb
@@ -28,8 +28,8 @@ def initialize(request_client:)
 # Use either transcript_ids or input_text as input into LeMUR.
 # @param input_text [String] Custom formatted transcript data. Maximum size is the context limit of the selected model, which defaults to 100000.
 # Use either transcript_ids or input_text as input into LeMUR.
-# @param context [Hash] Context to provide the model. This can be a string or a free-form JSON value.Request of type Lemur::LemurBaseParamsContext, as a Hash
-# @param final_model [LEMUR_MODEL] The model that is used for the final prompt after compression is performed.
+# @param context [String, Hash{String => String}] Context to provide the model. This can be a string or a free-form JSON value.
+# @param final_model [Lemur::LemurModel] The model that is used for the final prompt after compression is performed.
 # Defaults to "default".
 # @param max_output_size [Integer] Max output size in tokens, up to 4000
 # @param temperature [Float] The temperature to use for the model.
@@ -66,8 +66,8 @@ def task(prompt:, transcript_ids: nil, input_text: nil, context: nil, final_mode
 # Use either transcript_ids or input_text as input into LeMUR.
 # @param input_text [String] Custom formatted transcript data. Maximum size is the context limit of the selected model, which defaults to 100000.
 # Use either transcript_ids or input_text as input into LeMUR.
-# @param context [Hash] Context to provide the model. This can be a string or a free-form JSON value.Request of type Lemur::LemurBaseParamsContext, as a Hash
-# @param final_model [LEMUR_MODEL] The model that is used for the final prompt after compression is performed.
+# @param context [String, Hash{String => String}] Context to provide the model. This can be a string or a free-form JSON value.
+# @param final_model [Lemur::LemurModel] The model that is used for the final prompt after compression is performed.
 # Defaults to "default".
 # @param max_output_size [Integer] Max output size in tokens, up to 4000
 # @param temperature [Float] The temperature to use for the model.
@@ -104,8 +104,8 @@ def summary(transcript_ids: nil, input_text: nil, context: nil, final_model: nil
 # Use either transcript_ids or input_text as input into LeMUR.
 # @param input_text [String] Custom formatted transcript data. Maximum size is the context limit of the selected model, which defaults to 100000.
 # Use either transcript_ids or input_text as input into LeMUR.
-# @param context [Hash] Context to provide the model. This can be a string or a free-form JSON value.Request of type Lemur::LemurBaseParamsContext, as a Hash
-# @param final_model [LEMUR_MODEL] The model that is used for the final prompt after compression is performed.
+# @param context [String, Hash{String => String}] Context to provide the model. This can be a string or a free-form JSON value.
+# @param final_model [Lemur::LemurModel] The model that is used for the final prompt after compression is performed.
 # Defaults to "default".
 # @param max_output_size [Integer] Max output size in tokens, up to 4000
 # @param temperature [Float] The temperature to use for the model.
@@ -146,8 +146,8 @@ def question_answer(questions:, transcript_ids: nil, input_text: nil, context: n
 # Use either transcript_ids or input_text as input into LeMUR.
 # @param input_text [String] Custom formatted transcript data. Maximum size is the context limit of the selected model, which defaults to 100000.
 # Use either transcript_ids or input_text as input into LeMUR.
-# @param context [Hash] Context to provide the model. This can be a string or a free-form JSON value.Request of type Lemur::LemurBaseParamsContext, as a Hash
-# @param final_model [LEMUR_MODEL] The model that is used for the final prompt after compression is performed.
+# @param context [String, Hash{String => String}] Context to provide the model. This can be a string or a free-form JSON value.
+# @param final_model [Lemur::LemurModel] The model that is used for the final prompt after compression is performed.
 # Defaults to "default".
 # @param max_output_size [Integer] Max output size in tokens, up to 4000
 # @param temperature [Float] The temperature to use for the model.
@@ -211,8 +211,8 @@ def initialize(request_client:)
 # Use either transcript_ids or input_text as input into LeMUR.
 # @param input_text [String] Custom formatted transcript data. Maximum size is the context limit of the selected model, which defaults to 100000.
 # Use either transcript_ids or input_text as input into LeMUR.
-# @param context [Hash] Context to provide the model. This can be a string or a free-form JSON value.Request of type Lemur::LemurBaseParamsContext, as a Hash
-# @param final_model [LEMUR_MODEL] The model that is used for the final prompt after compression is performed.
+# @param context [String, Hash{String => String}] Context to provide the model. This can be a string or a free-form JSON value.
+# @param final_model [Lemur::LemurModel] The model that is used for the final prompt after compression is performed.
 # Defaults to "default".
 # @param max_output_size [Integer] Max output size in tokens, up to 4000
 # @param temperature [Float] The temperature to use for the model.
@@ -251,8 +251,8 @@ def task(prompt:, transcript_ids: nil, input_text: nil, context: nil, final_mode
 # Use either transcript_ids or input_text as input into LeMUR.
 # @param input_text [String] Custom formatted transcript data. Maximum size is the context limit of the selected model, which defaults to 100000.
 # Use either transcript_ids or input_text as input into LeMUR.
-# @param context [Hash] Context to provide the model. This can be a string or a free-form JSON value.Request of type Lemur::LemurBaseParamsContext, as a Hash
-# @param final_model [LEMUR_MODEL] The model that is used for the final prompt after compression is performed.
+# @param context [String, Hash{String => String}] Context to provide the model. This can be a string or a free-form JSON value.
+# @param final_model [Lemur::LemurModel] The model that is used for the final prompt after compression is performed.
 # Defaults to "default".
 # @param max_output_size [Integer] Max output size in tokens, up to 4000
 # @param temperature [Float] The temperature to use for the model.
@@ -291,8 +291,8 @@ def summary(transcript_ids: nil, input_text: nil, context: nil, final_model: nil
 # Use either transcript_ids or input_text as input into LeMUR.
 # @param input_text [String] Custom formatted transcript data. Maximum size is the context limit of the selected model, which defaults to 100000.
 # Use either transcript_ids or input_text as input into LeMUR.
-# @param context [Hash] Context to provide the model. This can be a string or a free-form JSON value.Request of type Lemur::LemurBaseParamsContext, as a Hash
-# @param final_model [LEMUR_MODEL] The model that is used for the final prompt after compression is performed.
+# @param context [String, Hash{String => String}] Context to provide the model. This can be a string or a free-form JSON value.
+# @param final_model [Lemur::LemurModel] The model that is used for the final prompt after compression is performed.
 # Defaults to "default".
 # @param max_output_size [Integer] Max output size in tokens, up to 4000
 # @param temperature [Float] The temperature to use for the model.
@@ -335,8 +335,8 @@ def question_answer(questions:, transcript_ids: nil, input_text: nil, context: n
 # Use either transcript_ids or input_text as input into LeMUR.
 # @param input_text [String] Custom formatted transcript data. Maximum size is the context limit of the selected model, which defaults to 100000.
 # Use either transcript_ids or input_text as input into LeMUR.
-# @param context [Hash] Context to provide the model. This can be a string or a free-form JSON value.Request of type Lemur::LemurBaseParamsContext, as a Hash
-# @param final_model [LEMUR_MODEL] The model that is used for the final prompt after compression is performed.
+# @param context [String, Hash{String => String}] Context to provide the model. This can be a string or a free-form JSON value.
+# @param final_model [Lemur::LemurModel] The model that is used for the final prompt after compression is performed.
 # Defaults to "default".
 # @param max_output_size [Integer] Max output size in tokens, up to 4000
 # @param temperature [Float] The temperature to use for the model.
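Per the updated doc comments, `context` is now documented as a plain `String` or `Hash` and `final_model` as a plain string value. A sketch of the call shape, with the client stubbed out so it runs without the gem or an API key (the transcript ID and returned hash are made up for illustration):

```ruby
# Stand-in with the same keyword signature as Lemur#task in this diff.
class StubLemurClient
  def task(prompt:, transcript_ids: nil, input_text: nil, context: nil, final_model: nil)
    # Echo the arguments back so the call shape is visible.
    { "prompt" => prompt, "context" => context, "final_model" => final_model }
  end
end

response = StubLemurClient.new.task(
  prompt: "Summarize the transcript in two sentences.",
  transcript_ids: ["abc-123"],            # hypothetical transcript ID
  context: { "audience" => "engineers" }, # plain Hash, no wrapper type needed
  final_model: "default"
)
puts response["final_model"]  # prints "default"
```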
10 changes: 5 additions & 5 deletions lib/assemblyai/lemur/types/lemur_base_params.rb
@@ -15,7 +15,7 @@ class LemurBaseParams
 # @param input_text [String] Custom formatted transcript data. Maximum size is the context limit of the selected model, which defaults to 100000.
 # Use either transcript_ids or input_text as input into LeMUR.
 # @param context [Lemur::LemurBaseParamsContext] Context to provide the model. This can be a string or a free-form JSON value.
-# @param final_model [LEMUR_MODEL] The model that is used for the final prompt after compression is performed.
+# @param final_model [Lemur::LemurModel] The model that is used for the final prompt after compression is performed.
 # Defaults to "default".
 # @param max_output_size [Integer] Max output size in tokens, up to 4000
 # @param temperature [Float] The temperature to use for the model.
@@ -33,7 +33,7 @@ def initialize(transcript_ids: nil, input_text: nil, context: nil, final_model:
 @input_text = input_text
 # @type [Lemur::LemurBaseParamsContext] Context to provide the model. This can be a string or a free-form JSON value.
 @context = context
-# @type [LEMUR_MODEL] The model that is used for the final prompt after compression is performed.
+# @type [Lemur::LemurModel] The model that is used for the final prompt after compression is performed.
 # Defaults to "default".
 @final_model = final_model
 # @type [Integer] Max output size in tokens, up to 4000
@@ -61,7 +61,7 @@ def self.from_json(json_object:)
 context = parsed_json["context"].to_json
 context = Lemur::LemurBaseParamsContext.from_json(json_object: context)
 end
-final_model = Lemur::LEMUR_MODEL.key(parsed_json["final_model"]) || parsed_json["final_model"]
+final_model = struct.final_model
 max_output_size = struct.max_output_size
 temperature = struct.temperature
 new(transcript_ids: transcript_ids, input_text: input_text, context: context, final_model: final_model,
@@ -76,7 +76,7 @@ def to_json(*_args)
 "transcript_ids": @transcript_ids,
 "input_text": @input_text,
 "context": @context,
-"final_model": Lemur::LEMUR_MODEL[@final_model] || @final_model,
+"final_model": @final_model,
 "max_output_size": @max_output_size,
 "temperature": @temperature
 }.to_json
@@ -90,7 +90,7 @@ def self.validate_raw(obj:)
 obj.transcript_ids&.is_a?(Array) != false || raise("Passed value for field obj.transcript_ids is not the expected type, validation failed.")
 obj.input_text&.is_a?(String) != false || raise("Passed value for field obj.input_text is not the expected type, validation failed.")
 obj.context.nil? || Lemur::LemurBaseParamsContext.validate_raw(obj: obj.context)
-obj.final_model&.is_a?(Lemur::LEMUR_MODEL) != false || raise("Passed value for field obj.final_model is not the expected type, validation failed.")
+obj.final_model&.is_a?(Lemur::LemurModel) != false || raise("Passed value for field obj.final_model is not the expected type, validation failed.")
 obj.max_output_size&.is_a?(Integer) != false || raise("Passed value for field obj.max_output_size is not the expected type, validation failed.")
 obj.temperature&.is_a?(Float) != false || raise("Passed value for field obj.temperature is not the expected type, validation failed.")
 end
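The net effect of this file's changes is that `final_model` now round-trips verbatim instead of being translated through the old `LEMUR_MODEL` symbol-to-string hash. A minimal runnable sketch of the new passthrough behavior (a simplified stand-in, not the full class):

```ruby
require "json"
require "ostruct"

# Simplified stand-in for the regenerated serialization path.
class LemurBaseParams
  attr_reader :final_model

  def initialize(final_model:)
    @final_model = final_model
  end

  def self.from_json(json_object:)
    struct = JSON.parse(json_object, object_class: OpenStruct)
    new(final_model: struct.final_model) # no LEMUR_MODEL.key() reverse lookup
  end

  def to_json(*_args)
    { "final_model": @final_model }.to_json # emitted as-is
  end
end

params = LemurBaseParams.from_json(json_object: '{"final_model":"anthropic/claude-2-1"}')
puts params.final_model  # prints "anthropic/claude-2-1"
```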
32 changes: 3 additions & 29 deletions lib/assemblyai/lemur/types/lemur_base_params_context.rb
@@ -6,15 +6,6 @@ module AssemblyAI
 class Lemur
 # Context to provide the model. This can be a string or a free-form JSON value.
 class LemurBaseParamsContext
-attr_reader :member
-alias kind_of? is_a?
-# @param member [Object]
-# @return [Lemur::LemurBaseParamsContext]
-def initialize(member:)
-# @type [Object]
-@member = member
-end
-
 # Deserialize a JSON object to an instance of LemurBaseParamsContext
 #
 # @param json_object [JSON]
@@ -23,26 +14,17 @@ def self.from_json(json_object:)
 struct = JSON.parse(json_object, object_class: OpenStruct)
 begin
 struct.is_a?(String) != false || raise("Passed value for field struct is not the expected type, validation failed.")
-member = json_object
-return new(member: member)
+return json_object
 rescue StandardError
 # noop
 end
 begin
 struct.is_a?(Hash) != false || raise("Passed value for field struct is not the expected type, validation failed.")
-member = json_object
-return new(member: member)
+return json_object
 rescue StandardError
 # noop
 end
-new(member: struct)
-end
-
-# For Union Types, to_json functionality is delegated to the wrapped member.
-#
-# @return [JSON]
-def to_json(*_args)
-@member.to_json
+struct
 end
 
 # Leveraged for Union-type generation, validate_raw attempts to parse the given hash and check each fields type against the current object's property definitions.
@@ -62,14 +44,6 @@ def self.validate_raw(obj:)
 end
 raise("Passed value matched no type within the union, validation failed.")
 end
-
-# For Union Types, is_a? functionality is delegated to the wrapped member.
-#
-# @param obj [Object]
-# @return [Boolean]
-def is_a?(obj)
-@member.is_a?(obj)
-end
 end
 end
 end
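After this change, `from_json` hands back the raw JSON value for string contexts instead of a wrapper object with a `.member` reader. A condensed, runnable copy of the new logic (only the String branch is kept here):

```ruby
require "json"
require "ostruct"

# Condensed copy of the regenerated LemurBaseParamsContext.from_json:
# it now returns the raw JSON value rather than new(member: ...).
class LemurBaseParamsContext
  def self.from_json(json_object:)
    struct = JSON.parse(json_object, object_class: OpenStruct)
    begin
      struct.is_a?(String) != false || raise("not a String")
      return json_object # the raw JSON text, passed through untouched
    rescue StandardError
      # noop
    end
    struct
  end
end

# A JSON string comes back as-is; callers no longer unwrap a .member.
puts LemurBaseParamsContext.from_json(json_object: '"focus on action items"')
```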
14 changes: 7 additions & 7 deletions lib/assemblyai/lemur/types/lemur_model.rb
@@ -2,12 +2,12 @@
 
 module AssemblyAI
 class Lemur
-# @type [LEMUR_MODEL]
-LEMUR_MODEL = {
-default: "default",
-basic: "basic",
-assemblyai_mistral7b: "assemblyai/mistral-7b",
-anthropic_claude2_1: "anthropic/claude-2-1"
-}.freeze
+# The model that is used for the final prompt after compression is performed.
+class LemurModel
+DEFAULT = "default"
+BASIC = "basic"
+ASSEMBLYAI_MISTRAL7B = "assemblyai/mistral-7b"
+ANTHROPIC_CLAUDE2_1 = "anthropic/claude-2-1"
+end
 end
 end

(Swimburger marked this conversation as resolved.)
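The frozen symbol-to-string hash becomes a class of string constants, so call sites reference values directly instead of doing a hash lookup. A sketch of the before and after at a call site (the module nesting below is copied from the diff):

```ruby
module AssemblyAI
  class Lemur
    # New shape, as generated in this PR.
    class LemurModel
      DEFAULT = "default"
      BASIC = "basic"
      ASSEMBLYAI_MISTRAL7B = "assemblyai/mistral-7b"
      ANTHROPIC_CLAUDE2_1 = "anthropic/claude-2-1"
    end
  end
end

# Before: AssemblyAI::Lemur::LEMUR_MODEL[:anthropic_claude2_1]
# After:  the constant is already the wire value.
puts AssemblyAI::Lemur::LemurModel::ANTHROPIC_CLAUDE2_1  # prints "anthropic/claude-2-1"
```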
32 changes: 3 additions & 29 deletions lib/assemblyai/lemur/types/lemur_question_context.rb
Original file line number Diff line number Diff line change
Expand Up @@ -6,15 +6,6 @@ module AssemblyAI
class Lemur
# Any context about the transcripts you wish to provide. This can be a string or any object.
class LemurQuestionContext
attr_reader :member
alias kind_of? is_a?
# @param member [Object]
# @return [Lemur::LemurQuestionContext]
def initialize(member:)
# @type [Object]
@member = member
end

# Deserialize a JSON object to an instance of LemurQuestionContext
#
# @param json_object [JSON]
Expand All @@ -23,26 +14,17 @@ def self.from_json(json_object:)
struct = JSON.parse(json_object, object_class: OpenStruct)
begin
struct.is_a?(String) != false || raise("Passed value for field struct is not the expected type, validation failed.")
member = json_object
return new(member: member)
return json_object
rescue StandardError
# noop
end
begin
struct.is_a?(Hash) != false || raise("Passed value for field struct is not the expected type, validation failed.")
member = json_object
return new(member: member)
return json_object
rescue StandardError
# noop
end
new(member: struct)
end

# For Union Types, to_json functionality is delegated to the wrapped member.
#
# @return [JSON]
def to_json(*_args)
@member.to_json
struct
end

# Leveraged for Union-type generation, validate_raw attempts to parse the given hash and check each fields type against the current object's property definitions.
Expand All @@ -62,14 +44,6 @@ def self.validate_raw(obj:)
end
raise("Passed value matched no type within the union, validation failed.")
end

# For Union Types, is_a? functionality is delegated to the wrapped member.
#
# @param obj [Object]
# @return [Boolean]
def is_a?(obj)
@member.is_a?(obj)
end
end
end
end
7 changes: 5 additions & 2 deletions lib/assemblyai/realtime/types/audio_encoding.rb
@@ -2,7 +2,10 @@
 
 module AssemblyAI
 class Realtime
-# @type [AUDIO_ENCODING]
-AUDIO_ENCODING = { pcm_s16le: "pcm_s16le", pcm_mulaw: "pcm_mulaw" }.freeze
+# The encoding of the audio data
+class AudioEncoding
+PCM_S16LE = "pcm_s16le"
+PCM_MULAW = "pcm_mulaw"
+end
 end
 end
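One consequence of the constants-on-a-class shape: the set of allowed wire values is no longer a single Hash you can call `.values` on, but it can still be gathered by reflection, for example to validate user input (a sketch; the nesting is copied from the diff):

```ruby
module AssemblyAI
  class Realtime
    # New shape, as generated in this PR.
    class AudioEncoding
      PCM_S16LE = "pcm_s16le"
      PCM_MULAW = "pcm_mulaw"
    end
  end
end

# Before: AUDIO_ENCODING.values. After: collect the constants' values.
allowed = AssemblyAI::Realtime::AudioEncoding.constants
                                             .map { |c| AssemblyAI::Realtime::AudioEncoding.const_get(c) }
puts allowed.sort.inspect  # prints ["pcm_mulaw", "pcm_s16le"]
```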
13 changes: 6 additions & 7 deletions lib/assemblyai/realtime/types/message_type.rb
@@ -2,12 +2,11 @@
 
 module AssemblyAI
 class Realtime
-# @type [MESSAGE_TYPE]
-MESSAGE_TYPE = {
-session_begins: "SessionBegins",
-partial_transcript: "PartialTranscript",
-final_transcript: "FinalTranscript",
-session_terminated: "SessionTerminated"
-}.freeze
+class MessageType
+SESSION_BEGINS = "SessionBegins"
+PARTIAL_TRANSCRIPT = "PartialTranscript"
+FINAL_TRANSCRIPT = "FinalTranscript"
+SESSION_TERMINATED = "SessionTerminated"
+end
 end
 end
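Since the constants are the exact wire strings, incoming realtime messages can be dispatched with a plain case/when against them (a sketch; the classifier and its labels are made up):

```ruby
module AssemblyAI
  class Realtime
    # New shape, as generated in this PR.
    class MessageType
      SESSION_BEGINS = "SessionBegins"
      PARTIAL_TRANSCRIPT = "PartialTranscript"
      FINAL_TRANSCRIPT = "FinalTranscript"
      SESSION_TERMINATED = "SessionTerminated"
    end
  end
end

# Hypothetical dispatcher for messages coming off the websocket.
def classify(message_type)
  case message_type
  when AssemblyAI::Realtime::MessageType::FINAL_TRANSCRIPT   then :final
  when AssemblyAI::Realtime::MessageType::PARTIAL_TRANSCRIPT then :partial
  when AssemblyAI::Realtime::MessageType::SESSION_TERMINATED then :closed
  else :other
  end
end

puts classify("FinalTranscript")  # prints "final"
```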
12 changes: 6 additions & 6 deletions lib/assemblyai/realtime/types/realtime_base_message.rb
@@ -8,11 +8,11 @@ class Realtime
 class RealtimeBaseMessage
 attr_reader :message_type, :additional_properties
 
-# @param message_type [MESSAGE_TYPE] Describes the type of the message
+# @param message_type [Realtime::MessageType] Describes the type of the message
 # @param additional_properties [OpenStruct] Additional properties unmapped to the current class definition
 # @return [Realtime::RealtimeBaseMessage]
 def initialize(message_type:, additional_properties: nil)
-# @type [MESSAGE_TYPE] Describes the type of the message
+# @type [Realtime::MessageType] Describes the type of the message
 @message_type = message_type
 # @type [OpenStruct] Additional properties unmapped to the current class definition
 @additional_properties = additional_properties
@@ -24,24 +24,24 @@ def initialize(message_type:, additional_properties: nil)
 # @return [Realtime::RealtimeBaseMessage]
 def self.from_json(json_object:)
 struct = JSON.parse(json_object, object_class: OpenStruct)
-parsed_json = JSON.parse(json_object)
-message_type = Realtime::MESSAGE_TYPE.key(parsed_json["message_type"]) || parsed_json["message_type"]
+JSON.parse(json_object)
+message_type = struct.message_type
 new(message_type: message_type, additional_properties: struct)
 end
 
 # Serialize an instance of RealtimeBaseMessage to a JSON object
 #
 # @return [JSON]
 def to_json(*_args)
-{ "message_type": Realtime::MESSAGE_TYPE[@message_type] || @message_type }.to_json
+{ "message_type": @message_type }.to_json
 end
 
 # Leveraged for Union-type generation, validate_raw attempts to parse the given hash and check each fields type against the current object's property definitions.
 #
 # @param obj [Object]
 # @return [Void]
 def self.validate_raw(obj:)
-obj.message_type.is_a?(Realtime::MESSAGE_TYPE) != false || raise("Passed value for field obj.message_type is not the expected type, validation failed.")
+obj.message_type.is_a?(Realtime::MessageType) != false || raise("Passed value for field obj.message_type is not the expected type, validation failed.")
 end
 end
 end
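Same passthrough pattern as in lemur_base_params.rb; one detail worth noting is that the full parsed struct is kept as `additional_properties`, so unmapped fields survive deserialization. A runnable stand-in mirroring the diffed code (the extra `session_id` field is made up):

```ruby
require "json"
require "ostruct"

# Stand-in mirroring the regenerated RealtimeBaseMessage.
class RealtimeBaseMessage
  attr_reader :message_type, :additional_properties

  def initialize(message_type:, additional_properties: nil)
    @message_type = message_type
    @additional_properties = additional_properties
  end

  def self.from_json(json_object:)
    struct = JSON.parse(json_object, object_class: OpenStruct)
    # message_type is read straight off the parsed struct; the whole struct
    # is retained so fields outside the schema remain reachable.
    new(message_type: struct.message_type, additional_properties: struct)
  end
end

msg = RealtimeBaseMessage.from_json(
  json_object: '{"message_type":"SessionBegins","session_id":"hypothetical-id"}'
)
puts msg.message_type                      # prints "SessionBegins"
puts msg.additional_properties.session_id  # prints "hypothetical-id"
```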