The ChatGPT API is currently supported; click here for the implementation instructions.
A message from the creator,
Thank you for visiting the @orhanerday/open-ai repository! If you find this repository helpful, we encourage you to star it on GitHub. Starring a repository is a way to show your support for the project; it also increases the project's visibility and lets the community know that it is valuable. Thanks again for your support, and we hope you find the repository useful!
Orhan
Project Name | Required PHP Version (Lower is better) | Description | Type (Official / Community) | Support |
---|---|---|---|---|
orhanerday/open-ai | PHP 7.4+ | The most downloaded, forked, and contributed-to PHP SDK for OpenAI GPT-3 and DALL-E, with a large supporting community. It also supports ChatGPT-like streaming. | Community | Available (community-driven Discord server or personal mail: [email protected]) |
openai-** /c***t | PHP 8.1+ | OpenAI PHP API client. | Community | - |
A fully open-source, secure, community-maintained PHP SDK for accessing the OpenAI GPT-3 API.
For more information, you can read the Laravel News blog post.
Free support is available. Join our Discord server.
To get started with this package, you'll first want to be familiar with the OpenAI API documentation and examples. You can also get help in our Discord channel called #api-support.
- orhanerday/open-ai was added to the community libraries PHP section.
- orhanerday/open-ai was featured in a PhpStorm blog post, thanks JetBrains!
Requires PHP 7.4+
Click here to join the Discord server
As you may know, OpenAI PHP is an open-source wrapper for the OpenAI API. We rely on the support of our community to continue developing and maintaining the project, and one way that you can help is by making a donation.
Donations allow us to cover expenses such as hosting costs (for testing), development tools, and other resources that are necessary to keep the project running smoothly. Every contribution, no matter how small, helps us to continue improving OpenAI PHP for everyone.
If you have benefited from using OpenAI PHP and would like to support its continued development, we would greatly appreciate a donation of any amount. You can make a donation through:
Thank you for considering a donation to Orhanerday/OpenAI PHP SDK. Your support is greatly appreciated and helps to ensure that the project can continue to grow and improve.
Sincerely,
Orhan Erday / Creator.
Please visit https://orhanerday.gitbook.io/openai-php-api-1/
- Chat
- Models
- Completions
- Edits
- Images
- Embeddings
- Audio
- Files
- Fine-tunes
- Moderation
- Engines (deprecated)
- Assistants (beta)
- Threads (beta)
- Messages (beta)
- Runs (beta)
You can install the package via composer:
composer require orhanerday/open-ai
Before you get started, set OPENAI_API_KEY as the environment variable name and your OpenAI key as its value, using the following commands:
Powershell
$Env:OPENAI_API_KEY = "sk-gjtv....."
Cmd
set OPENAI_API_KEY=sk-gjtv.....
Linux or macOS
export OPENAI_API_KEY=sk-gjtv.....
Having issues setting up the environment variable? Please read the article, or check my Stack Overflow answer for the Windows® ENV setup.
Create your index.php file and paste the following code into it.
<?php
require __DIR__ . '/vendor/autoload.php'; // remove this line if you use a PHP Framework.
use Orhanerday\OpenAi\OpenAi;
$open_ai_key = getenv('OPENAI_API_KEY');
$open_ai = new OpenAi($open_ai_key);
$chat = $open_ai->chat([
'model' => 'gpt-3.5-turbo',
'messages' => [
[
"role" => "system",
"content" => "You are a helpful assistant."
],
[
"role" => "user",
"content" => "Who won the world series in 2020?"
],
[
"role" => "assistant",
"content" => "The Los Angeles Dodgers won the World Series in 2020."
],
[
"role" => "user",
"content" => "Where was it played?"
],
],
'temperature' => 1.0,
'max_tokens' => 4000,
'frequency_penalty' => 0,
'presence_penalty' => 0,
]);
var_dump($chat);
echo "<br>";
echo "<br>";
echo "<br>";
// decode response
$d = json_decode($chat);
// Get Content
echo($d->choices[0]->message->content);
Run the server with the following command:
php -S localhost:8000 -t .
orhanerday/open-ai supports NVIDIA NIM. The example below uses Mistral AI's Mixtral model; check https://build.nvidia.com/explore/discover for more examples.
<?php
require __DIR__ . '/vendor/autoload.php'; // remove this line if you use a PHP Framework.
use Orhanerday\OpenAi\OpenAi;
$nvidia_ai_key = getenv('NVIDIA_AI_API_KEY');
error_log($nvidia_ai_key);
$open_ai = new OpenAi($nvidia_ai_key);
$open_ai->setBaseURL("https://integrate.api.nvidia.com");
$chat = $open_ai->chat([
'model' => 'mistralai/mixtral-8x7b-instruct-v0.1',
'messages' => [["role" => "user", "content" => "Write a limerick about the wonders of GPU computing."]],
'temperature' => 0.5,
'max_tokens' => 1024,
'top_p' => 1,
]);
var_dump($chat);
echo "<br>";
echo "<br>";
echo "<br>";
// decode response
$d = json_decode($chat);
// Get Content
echo ($d->choices[0]->message->content);
In the following code, $open_ai is the base variable for all OpenAI operations.
use Orhanerday\OpenAi\OpenAi;
$open_ai = new OpenAi(env('OPEN_AI_API_KEY'));
For users who belong to multiple organizations, you can pass a header to specify which organization is used for an API request. Usage from these API requests will count against the specified organization's subscription quota.
$open_ai_key = getenv('OPENAI_API_KEY');
$open_ai = new OpenAi($open_ai_key);
$open_ai->setORG("org-IKN2E1nI3kFYU8ywaqgFRKqi");
You can specify the origin (base) URL with the setBaseURL() method:
$open_ai_key = getenv('OPENAI_API_KEY');
$open_ai = new OpenAi($open_ai_key);
$open_ai->setBaseURL("https://ai.example.com/");
You can use a proxy server for your API requests:
$open_ai->setProxy("http://127.0.0.1:1086");
$open_ai->setHeader(["Connection"=>"keep-alive"]);
You can get cURL info after the request.
$open_ai = new OpenAi($open_ai_key);
echo $open_ai->listModels(); // you should execute a request FIRST!
var_dump($open_ai->getCURLInfo()); // then you can read the cURL info for that request
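Because every method returns the raw JSON string from the API, you can pair getCURLInfo() with a decode step for basic error handling. A minimal sketch, assuming $open_ai is initialized as above and assuming the standard OpenAI error response shape:
$response = $open_ai->listModels();
$info = $open_ai->getCURLInfo();

if ($info['http_code'] !== 200) {
    // OpenAI error responses usually carry an "error" object with a "message" field.
    $error = json_decode($response);
    echo 'Request failed: ' . ($error->error->message ?? 'unknown error');
} else {
    $models = json_decode($response);
    echo 'First model id: ' . $models->data[0]->id;
}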
Given a chat conversation, the model will return a chat completion response.
$complete = $open_ai->chat([
'model' => 'gpt-3.5-turbo',
'messages' => [
[
"role" => "system",
"content" => "You are a helpful assistant."
],
[
"role" => "user",
"content" => "Who won the world series in 2020?"
],
[
"role" => "assistant",
"content" => "The Los Angeles Dodgers won the World Series in 2020."
],
[
"role" => "user",
"content" => "Where was it played?"
],
],
'temperature' => 1.0,
'max_tokens' => 4000,
'frequency_penalty' => 0,
'presence_penalty' => 0,
]);
<?php
// Dummy Response For Chat API
$j = '
{
"id":"chatcmpl-*****",
"object":"chat.completion",
"created":1679748856,
"model":"gpt-3.5-turbo-0301",
"usage":{
"prompt_tokens":9,
"completion_tokens":10,
"total_tokens":19
},
"choices":[
{
"message":{
"role":"assistant",
"content":"This is a test of the AI language model."
},
"finish_reason":"length",
"index":0
}
]
}
';
// decode response
$d = json_decode($j);
// Get Content
echo($d->choices[0]->message->content);
Related: ChatGPT Clone Project
Given a prompt, the model will return one or more predicted completions, and can also return the probabilities of alternative tokens at each position.
$complete = $open_ai->completion([
'model' => 'gpt-3.5-turbo-instruct',
'prompt' => 'Hello',
'temperature' => 0.9,
'max_tokens' => 150,
'frequency_penalty' => 0,
'presence_penalty' => 0.6,
]);
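The completion response is a JSON string as well; note that completions return choices[0]->text rather than a message object. A minimal decode sketch:
// decode response
$d = json_decode($complete);
// Get Content
echo $d->choices[0]->text;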
This feature might sound familiar from ChatGPT.
Video of demo:
Isimsiz.video.Clipchamp.ile.yapildi.mp4
ChatGPT clone is a simple web application powered by the OpenAI library and built with PHP. It allows users to chat with an AI language model that responds in real time. Chat history is saved using cookies, and the project requires an API key and SQLite3 enabled.
URL of the ChatGPT-Clone repo: https://github.com/orhanerday/ChatGPT
Whether to stream back partial progress. If set, tokens will be sent as data-only server-sent events as they become available, with the stream terminated by a data: [DONE] message.
$open_ai = new OpenAi(env('OPEN_AI_API_KEY'));
$opts = [
'prompt' => "Hello",
'temperature' => 0.9,
"max_tokens" => 150,
"frequency_penalty" => 0,
"presence_penalty" => 0.6,
"stream" => true,
];
header('Content-type: text/event-stream');
header('Cache-Control: no-cache');
$open_ai->completion($opts, function ($curl_info, $data) {
echo $data . "<br><br>";
echo PHP_EOL;
ob_flush();
flush();
return strlen($data);
});
Add this part inside the <body> of the HTML:
<div id="divID">Hello</div>
<script>
var eventSource = new EventSource("/");
var div = document.getElementById('divID');
eventSource.onmessage = function (e) {
    if (e.data == "[DONE]") {
        div.innerHTML += "<br><br>Hello";
        return; // "[DONE]" is not JSON, so stop before parsing it
    }
    div.innerHTML += JSON.parse(e.data).choices[0].text;
};
eventSource.onerror = function (e) {
console.log(e);
};
</script>
You should see a response like the one in the video:
stream-event.mp4
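The same streaming pattern also works for chat completions; a minimal sketch, assuming chat() accepts the same optional stream callback as completion() (parameter values are illustrative):
$open_ai = new OpenAi(env('OPEN_AI_API_KEY'));

$opts = [
    'model' => 'gpt-3.5-turbo',
    'messages' => [
        ["role" => "user", "content" => "Hello!"],
    ],
    'stream' => true,
];

header('Content-type: text/event-stream');
header('Cache-Control: no-cache');

$open_ai->chat($opts, function ($curl_info, $data) {
    // Each $data chunk is a data-only server-sent event; forward it to the client as-is.
    echo $data;
    echo PHP_EOL;
    ob_flush();
    flush();

    return strlen($data);
});
Note that streamed chat chunks expose the text under choices[0].delta.content, so a client like the EventSource example above would read that field instead of choices[0].text.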
Creates a new edit for the provided input, instruction, and parameters.
$result = $open_ai->createEdit([
"model" => "text-davinci-edit-001",
"input" => "What day of the wek is it?",
"instruction" => "Fix the spelling mistakes",
]);
All DALL·E Examples available in this repo.
Given a prompt, the model will return one or more generated images as urls or base64 encoded.
Creates an image given a prompt.
$complete = $open_ai->image([
"prompt" => "A cat drinking milk",
"n" => 1,
"size" => "256x256",
"response_format" => "url",
]);
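With "response_format" => "url", the generated image URL(s) can be read from the decoded response; a minimal sketch, assuming the standard image response shape:
// decode response
$d = json_decode($complete);
// Each element of "data" holds one generated image
echo $d->data[0]->url;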
Creates an edited or extended image given an original image and a prompt.
Need HTML upload for image edit or variation? Please check the DALL·E examples.
$otter = curl_file_create(__DIR__ . '/files/otter.png');
$mask = curl_file_create(__DIR__ . '/files/mask.jpg');
$result = $open_ai->imageEdit([
"image" => $otter,
"mask" => $mask,
"prompt" => "A cute baby sea otter wearing a beret",
"n" => 2,
"size" => "1024x1024",
]);
Creates a variation of a given image.
$otter = curl_file_create(__DIR__ . '/files/otter.png');
$result = $open_ai->createImageVariation([
"image" => $otter,
"n" => 2,
"size" => "256x256",
]);
(Deprecated)
This endpoint is deprecated and will be removed on December 3rd, 2022. OpenAI has developed new methods with better performance. Learn more.
Given a query and a set of documents or labels, the model ranks each document based on its semantic similarity to the provided query.
$search = $open_ai->search([
'engine' => 'ada',
'documents' => ['White House', 'hospital', 'school'],
'query' => 'the president',
]);
Get a vector representation of a given input that can be easily consumed by machine learning models and algorithms.
Related guide: Embeddings
$result = $open_ai->embeddings([
"model" => "text-similarity-babbage-001",
"input" => "The food was delicious and the waiter..."
]);
(Deprecated)
This endpoint is deprecated and will be removed on December 3rd, 2022. We've developed new methods with better performance. Learn more.
Given a question, a set of documents, and some examples, the API generates an answer to the question based on the information in the set of documents. This is useful for question-answering applications on sources of truth, like company documentation or a knowledge base.
$answer = $open_ai->answer([
'documents' => ['Puppy A is happy.', 'Puppy B is sad.'],
'question' => 'which puppy is happy?',
'search_model' => 'ada',
'model' => 'curie',
'examples_context' => 'In 2017, U.S. life expectancy was 78.6 years.',
'examples' => [['What is human life expectancy in the United States?', '78 years.']],
'max_tokens' => 5,
'stop' => ["\n", '<|endoftext|>'],
]);
(Deprecated)
This endpoint is deprecated and will be removed on December 3rd, 2022. OpenAI has developed new methods with better performance. Learn more.
Given a query and a set of labeled examples, the model will predict the most likely label for the query. Useful as a drop-in replacement for any ML classification or text-to-label task.
$classification = $open_ai->classification([
'examples' => [
['A happy moment', 'Positive'],
['I am sad.', 'Negative'],
['I am feeling awesome', 'Positive'],
],
'labels' => ['Positive', 'Negative', 'Neutral'],
'query' => 'It is a raining day =>(',
'search_model' => 'ada',
'model' => 'curie',
]);
Given an input text, outputs whether the model classifies it as violating OpenAI's content policy.
$flags = $open_ai->moderation([
'input' => 'I want to kill them.'
]);
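The moderation response contains a results array with an overall flagged flag and per-category booleans; a minimal sketch for inspecting it, assuming the standard response shape:
$d = json_decode($flags);
$result = $d->results[0];

if ($result->flagged) {
    // List the categories that triggered the flag
    foreach ($result->categories as $category => $violated) {
        if ($violated) {
            echo $category . PHP_EOL;
        }
    }
} else {
    echo "No policy violation detected.";
}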
Learn more about content moderation here: OpenAI Moderations
(Deprecated)
The Engines endpoints are deprecated. Please use their replacement, Models, instead. Learn more.
Lists the currently available engines, and provides basic information about each one such as the owner and availability.
$engines = $open_ai->engines();
$result = $open_ai->tts([
"model" => "tts-1", // tts-1-hd
"input" => "I'm going to use the stones again. Hey, we'd be going in short-handed, you know",
"voice" => "alloy", // echo, fable, onyx, nova, and shimmer
]);
// Save audio file
file_put_contents('tts-result.mp3', $result);
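Instead of saving the audio to disk, you can also stream it straight to the browser; a minimal sketch:
// $result holds the binary MP3 data returned by the endpoint
header('Content-Type: audio/mpeg');
echo $result;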
Transcribes audio into the input language.
$c_file = curl_file_create(__DIR__ . '/files/en-marvel-endgame.m4a');
$result = $open_ai->transcribe([
"model" => "whisper-1",
"file" => $c_file,
]);
{
"text": "I'm going to use the stones again. Hey, we'd be going in short-handed, you know. Look, he's still got the stones, so... So let's get them. Use them to bring everyone back. Just like that? Yeah, just like that. Even if there's a small chance that we can undo this, I mean, we owe it to everyone who's not in this room to try. If we do this, how do we know it's going to end any differently than it did before? Because before you didn't have me. Hey, little girl, everybody in this room is about that superhero life. And if you don't mind my asking, where the hell have you been all this time? There are a lot of other planets in the universe. But unfortunately, they didn't have you guys. I like this one. Let's go get this son of a bitch."
}
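The transcription response is a JSON string, so the text can be pulled out with a simple decode:
// decode response
$d = json_decode($result);
echo $d->text;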
Translates audio into English.
I used a Turkish voice recording for the translation, courtesy of the famous science YouTuber Barış Özcan.
$c_file = curl_file_create(__DIR__ . '/files/tr-baris-ozcan-youtuber.m4a');
$result = $open_ai->translate([
"model" => "whisper-1",
"file" => $c_file,
]);
{
"text": "GPT-3. Last month, the biggest leap in the world of artificial intelligence in recent years happened silently. Maybe the biggest leap of all time. GPT-3's beta version was released by OpenAI. When you hear such a sentence, you may think, what kind of leap is this? But be sure, this is the most advanced language model with the most advanced language model with the most advanced language ability. It can answer these artificial intelligence questions, it can translate and even write poetry. Those who have gained access to the API or API of GPT-3 have already started to make very interesting experiments. Let's look at a few examples together. Let's start with an example of aphorism. This site produces beautiful words that you can tweet. Start to actually do things with your words instead of just thinking about them."
}
Need HTML upload for audio? Check this section and change the API references. Example:
...
echo $open_ai->translate(
[
"purpose" => "answers",
"file" => $c_file,
]
);
...
// OR
...
echo $open_ai->transcribe(
[
"purpose" => "answers",
"file" => $c_file,
]
);
...
Files are used to upload documents that can be used across features like Answers, Search, and Classifications.
Returns a list of files that belong to the user's organization.
$files = $open_ai->listFiles();
Upload a file that contains document(s) to be used across various endpoints/features. Currently, the size of all the files uploaded by one organization can be up to 1 GB. Please contact OpenAI if you need to increase the storage limit.
$c_file = curl_file_create(__DIR__ . '/files/sample_file_1.jsonl');
$result = $open_ai->uploadFile([
"purpose" => "answers",
"file" => $c_file,
]);
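If you don't already have a JSONL file to upload, a minimal sketch for creating one (the file name and contents are illustrative):
// Each line of a .jsonl file is one standalone JSON document
$lines = [
    json_encode(['text' => 'Puppy A is happy.']),
    json_encode(['text' => 'Puppy B is sad.']),
];
file_put_contents(__DIR__ . '/files/sample_file_1.jsonl', implode(PHP_EOL, $lines) . PHP_EOL);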
<form action="index.php" method="post" enctype="multipart/form-data">
Select file to upload:
<input type="file" name="fileToUpload" id="fileToUpload">
<input type="submit" value="Upload File" name="submit">
</form>
<?php
require __DIR__ . '/vendor/autoload.php';
use Orhanerday\OpenAi\OpenAi;
if ($_SERVER['REQUEST_METHOD'] == 'POST') {
ob_clean();
$open_ai = new OpenAi(env('OPEN_AI_API_KEY'));
$tmp_file = $_FILES['fileToUpload']['tmp_name'];
$file_name = basename($_FILES['fileToUpload']['name']);
$c_file = curl_file_create($tmp_file, $_FILES['fileToUpload']['type'], $file_name);
echo "[";
echo $open_ai->uploadFile(
[
"purpose" => "answers",
"file" => $c_file,
]
);
echo ",";
echo $open_ai->listFiles();
echo "]";
}
$result = $open_ai->deleteFile('file-xxxxxxxx');
$file = $open_ai->retrieveFile('file-xxxxxxxx');
$file = $open_ai->retrieveFileContent('file-xxxxxxxx');
Manage fine-tuning jobs to tailor a model to your specific training data.
$result = $open_ai->createFineTune([
"model" => "gpt-3.5-turbo-1106",
"training_file" => "file-U3KoAAtGsjUKSPXwEUDdtw86",
]);
$fine_tunes = $open_ai->listFineTunes();
$fine_tune = $open_ai->retrieveFineTune('ft-AF1WoRqd3aJAHsqc9NY7iL8F');
$result = $open_ai->cancelFineTune('ft-AF1WoRqd3aJAHsqc9NY7iL8F');
$fine_tune_events = $open_ai->listFineTuneEvents('ft-AF1WoRqd3aJAHsqc9NY7iL8F');
$result = $open_ai->deleteFineTune('curie:ft-acmeco-2021-03-03-21-44-20');
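The createFineTune() response is a JSON string carrying the job id, which the other fine-tune methods expect; a minimal sketch, assuming the response includes id and status fields:
$result = $open_ai->createFineTune([
    "model" => "gpt-3.5-turbo-1106",
    "training_file" => "file-U3KoAAtGsjUKSPXwEUDdtw86",
]);
$job = json_decode($result);

// Check on the job later using its id
$status = json_decode($open_ai->retrieveFineTune($job->id));
echo $status->status; // e.g. "running" or "succeeded"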
(Deprecated)
Retrieves an engine instance, providing basic information about the engine such as the owner and availability.
$engine = $open_ai->engine('davinci');
List and describe the various models available in the API.
Lists the currently available models, and provides basic information about each one such as the owner and availability.
$result = $open_ai->listModels();
Retrieves a model instance, providing basic information about the model such as the owner and permissioning.
$result = $open_ai->retrieveModel("text-ada-001");
echo $search; // every method returns the response as a JSON string, so any result can be printed directly
Allows you to build AI assistants within your own applications.
Create an assistant with a model and instructions.
$data = [
'model' => 'gpt-3.5-turbo',
'name' => 'my assistant',
'description' => 'my assistant description',
'instructions' => 'you should cordially help me',
'tools' => [],
'file_ids' => [],
];
$assistant = $open_ai->createAssistant($data);
$assistantId = 'asst_zT1LLZ8dWnuFCrMFzqxFOhzz';
$assistant = $open_ai->retrieveAssistant($assistantId);
$assistantId = 'asst_zT1LLZ8dWnuFCrMFzqxFOhzz';
$data = [
'name' => 'my modified assistant',
'instructions' => 'you should cordially help me again',
];
$assistant = $open_ai->modifyAssistant($assistantId, $data);
$assistantId = 'asst_DgiOnXK7nRfyvqoXWpFlwESc';
$assistant = $open_ai->deleteAssistant($assistantId);
Returns a list of assistants.
$query = ['limit' => 10];
$assistants = $open_ai->listAssistants($query);
Create an assistant file by attaching a File to an assistant.
$assistantId = 'asst_zT1LLZ8dWnuFCrMFzqxFOhzz';
$fileId = 'file-jrNZZZBAPGnhYUKma7CblGoR';
$file = $open_ai->createAssistantFile($assistantId, $fileId);
$assistantId = 'asst_zT1LLZ8dWnuFCrMFzqxFOhzz';
$fileId = 'file-jrNZZZBAPGnhYUKma7CblGoR';
$file = $open_ai->retrieveAssistantFile($assistantId, $fileId);
$assistantId = 'asst_zT1LLZ8dWnuFCrMFzqxFOhzz';
$fileId = 'file-jrNZZZBAPGnhYUKma7CblGoR';
$file = $open_ai->deleteAssistantFile($assistantId, $fileId);
Returns a list of assistant files.
$assistantId = 'asst_zT1LLZ8dWnuFCrMFzqxFOhzz';
$query = ['limit' => 10];
$files = $open_ai->listAssistantFiles($assistantId, $query);
Create threads that assistants can interact with.
$data = [
'messages' => [
[
'role' => 'user',
'content' => 'Hello, what is AI?',
'file_ids' => [],
],
],
];
$thread = $open_ai->createThread($data);
$threadId = 'thread_YKDArENVWFDO2Xz3POifFYlp';
$thread = $open_ai->retrieveThread($threadId);
$threadId = 'thread_YKDArENVWFDO2Xz3POifFYlp';
$data = [
'metadata' => ['test' => '1234abcd'],
];
$thread = $open_ai->modifyThread($threadId, $data);
$threadId = 'thread_YKDArENVWFDO2Xz3POifFYlp';
$thread = $open_ai->deleteThread($threadId);
Create messages within threads.
$threadId = 'thread_YKDArENVWFDO2Xz3POifFYlp';
$data = [
'role' => 'user',
'content' => 'How does AI work? Explain it in simple terms.',
];
$message = $open_ai->createThreadMessage($threadId, $data);
$threadId = 'thread_d86alfR2rfF7rASyV4V7hicz';
$messageId = 'msg_d37P5XgREsm6BItOcppnBO1b';
$message = $open_ai->retrieveThreadMessage($threadId, $messageId);
$threadId = 'thread_d86alfR2rfF7rASyV4V7hicz';
$messageId = 'msg_d37P5XgREsm6BItOcppnBO1b';
$data = [
'metadata' => ['test' => '1234abcd'],
];
$message = $open_ai->modifyThreadMessage($threadId, $messageId, $data);
Returns a list of messages for a given thread.
$threadId = 'thread_d86alfR2rfF7rASyV4V7hicz';
$query = ['limit' => 10];
$messages = $open_ai->listThreadMessages($threadId, $query);
$threadId = 'thread_d86alfR2rfF7rASyV4V7hicz';
$messageId = 'msg_CZ47kAGZugAfeHMX6bmJIukP';
$fileId = 'file-CRLcY63DiHphWuBrmDWZVCgA';
$file = $open_ai->retrieveMessageFile($threadId, $messageId, $fileId);
Returns a list of message files.
$threadId = 'thread_d86alfR2rfF7rASyV4V7hicz';
$messageId = 'msg_CZ47kAGZugAfeHMX6bmJIukP';
$query = ['limit' => 10];
$files = $open_ai->listMessageFiles($threadId, $messageId, $query);
Represents an execution run on a thread.
$threadId = 'thread_d86alfR2rfF7rASyV4V7hicz';
$data = ['assistant_id' => 'asst_zT1LLZ8dWnuFCrMFzqxFOhzz'];
$run = $open_ai->createRun($threadId, $data);
$threadId = 'thread_JZbzCYpYgpNb79FNeneO3cGI';
$runId = 'run_xBKYFcD2Jg3gnfrje6fhiyXj';
$run = $open_ai->retrieveRun($threadId, $runId);
$threadId = 'thread_JZbzCYpYgpNb79FNeneO3cGI';
$runId = 'run_xBKYFcD2Jg3gnfrje6fhiyXj';
$data = [
'metadata' => ['test' => 'abcd1234'],
];
$run = $open_ai->modifyRun($threadId, $runId, $data);
Returns a list of runs belonging to a thread.
$threadId = 'thread_JZbzCYpYgpNb79FNeneO3cGI';
$query = ['limit' => 10];
$runs = $open_ai->listRuns($threadId, $query);
When a run has the status "requires_action" and required_action.type is "submit_tool_outputs", this endpoint can be used to submit the outputs from the tool calls once they're all completed. All outputs must be submitted in a single request.
$threadId = 'thread_JZbzCYpYgpNb79FNeneO3cGI';
$runId = 'run_xBKYFcD2Jg3gnfrje6fhiyXj';
$outputs = [
'tool_outputs' => [
['tool_call_id' => 'call_abc123', 'output' => '28C'],
],
];
$run = $open_ai->submitToolOutputs($threadId, $runId, $outputs);
Cancels a run that is "in_progress".
$threadId = 'thread_JZbzCYpYgpNb79FNeneO3cGI';
$runId = 'run_xBKYFcD2Jg3gnfrje6fhiyXj';
$run = $open_ai->cancelRun($threadId, $runId);
Create a thread and run it in one request.
$data = [
'assistant_id' => 'asst_zT1LLZ8dWnuFCrMFzqxFOhzz',
'thread' => [
'messages' => [
[
'role' => 'user',
'content' => 'Hello, what is AI?',
'file_ids' => [],
],
],
],
];
$run = $open_ai->createThreadAndRun($data);
Retrieves a step in the execution of a run.
$threadId = 'thread_JZbzCYpYgpNb79FNeneO3cGI';
$runId = 'run_xBKYFcD2Jg3gnfrje6fhiyXj';
$stepId = 'step_kwLG0vPQjqVyQHVoL7GVK3aG';
$step = $open_ai->retrieveRunStep($threadId, $runId, $stepId);
Returns a list of run steps belonging to a run.
$threadId = 'thread_JZbzCYpYgpNb79FNeneO3cGI';
$runId = 'run_xBKYFcD2Jg3gnfrje6fhiyXj';
$query = ['limit' => 10];
$steps = $open_ai->listRunSteps($threadId, $runId, $query);
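Putting the Assistants pieces together, a typical flow is: create a thread with the user's message, start a run for an assistant, poll the run until it reaches a terminal status, then read the reply from the thread messages. A minimal sketch, assuming $open_ai is initialized as above and assuming the standard Assistants response shapes (IDs are illustrative):
$assistantId = 'asst_zT1LLZ8dWnuFCrMFzqxFOhzz';

// 1. Create a thread containing the user's question
$thread = json_decode($open_ai->createThread([
    'messages' => [
        ['role' => 'user', 'content' => 'Hello, what is AI?'],
    ],
]));

// 2. Start a run for the assistant on that thread
$run = json_decode($open_ai->createRun($thread->id, ['assistant_id' => $assistantId]));

// 3. Poll until the run leaves the queued/in_progress states
do {
    sleep(1);
    $run = json_decode($open_ai->retrieveRun($thread->id, $run->id));
} while (in_array($run->status, ['queued', 'in_progress'], true));

// 4. Read the assistant's reply from the newest thread message
$messages = json_decode($open_ai->listThreadMessages($thread->id, ['limit' => 1]));
echo $messages->data[0]->content[0]->text->value;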
To run all tests:
composer test
To run only the tests that work for most users (excluding those that require a missing folder or that hit deprecated endpoints no longer available to most users):
./vendor/bin/pest --group=working
Please see CHANGELOG for more information on what has changed recently.
Please see CONTRIBUTING for details.
Please report security vulnerabilities to [email protected]
The MIT License (MIT). Please see License File for more information.