diff --git a/docs/academy/ai-factory.mdx b/docs/academy/ai-factory.mdx index f9a049e9006..fa7709fe9a1 100644 --- a/docs/academy/ai-factory.mdx +++ b/docs/academy/ai-factory.mdx @@ -33,7 +33,7 @@ The AI factory is a decentralized framework where anyone can collaborate and gov AI Factory is a set of : -- Rules & governance templates : data requirements defiitions, contribution quantification, token factory for DAO’s +- Rules & governance templates: data requirements definitions, contribution quantification, token factory for DAOs - Open source algorithms ready to be trained and tailored services/workflows - Integrations and connectors diff --git a/docs/academy/consume-resources.mdx b/docs/academy/consume-resources.mdx index ad01ec15325..bb846f2aaba 100644 --- a/docs/academy/consume-resources.mdx +++ b/docs/academy/consume-resources.mdx @@ -27,7 +27,7 @@ This can be formalized very simply, with a set of resources as input and a set o Once we have defined the elementary processing unit, we can combine them to form a more complex processing structure called a Workflow. A workflow is a plan that outlines a series of activities, where some activities depend on the output of others. These interdependencies form a Directed Acyclic Graph (DAG) structure, ensuring that the workflow progresses without circular references or loops. -Let’s consider an example to illustrate this concept: +Let's consider an example to illustrate this concept: ![consume-resources-2](/img/content/academy/consume-resources-2.webp) @@ -47,7 +47,7 @@ Here's a detailed breakdown: 4. **Output and Integration**: The output of these workflows may vary based on the nature of the task. It could be a new dataset, data processing results, software outputs such as trained AI, etc. This output might also contribute to the Dataverse, further enriching the ecosystem. -5. **Payments**: Once the workflow is finished, tokens are sent to the different resource providers according to the business model.
Excess tokens may be sent back to the consumer. Eventual slashing can be applied to providers who didn’t provide the expected service. +5. **Payments**: Once the workflow is finished, tokens are sent to the different resource providers according to the business model. Excess tokens may be sent back to the consumer. Slashing may be applied to providers who fail to deliver the expected service. 6. **Flexibility and Adaptation**: Workflows in OKP4 are designed to be flexible, allowing for adaptation per changing needs or rules within the Zones. This ensures that workflows can evolve in response to new opportunities or requirements. @@ -70,7 +70,7 @@ The workflow is purely declarative and merely expresses the activities to be per ![consume-resources-4](/img/content/academy/consume-resources-4.webp) -In its centralized perspective, orchestration relies on a single authority responsible for taking the necessary actions to accomplish a set of activities in response to a declarative expression of needs. This authority holds the logic for execution. Let’s consider the example of Kubernetes — a well-known Container Orchestrator — its API enables the "declaration" of resources and will manage the operations and lifecycle logic associated with these resources. +From a centralized perspective, orchestration relies on a single authority responsible for taking the necessary actions to accomplish a set of activities in response to a declarative expression of needs. This authority holds the logic for execution. Let's consider the example of Kubernetes — a well-known Container Orchestrator — its API enables the "declaration" of resources and will manage the operations and lifecycle logic associated with these resources. By adopting a decentralized approach, one would expect the blockchain to handle orchestration. However, due to its highly constrained communication capabilities with the external environment, it cannot fulfill this role effectively.
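Whichever party ends up orchestrating, the DAG constraint on workflows means activities must run in a topological order: an activity only starts once every activity it depends on has produced its output, and a cycle is rejected outright. A minimal standard-library sketch (the activity names are invented for illustration; this is not OKP4 code):

```python
from graphlib import TopologicalSorter, CycleError

# Hypothetical workflow: each activity maps to the set of activities it depends on.
workflow = {
    "clean_data": {"fetch_dataset"},
    "train_model": {"clean_data"},
    "publish_output": {"train_model"},
    "fetch_dataset": set(),
}

# static_order() yields a valid execution order: dependencies before dependents.
order = list(TopologicalSorter(workflow).static_order())
print(order)

# A circular reference is rejected, which is exactly the DAG constraint.
try:
    list(TopologicalSorter({"a": {"b"}, "b": {"a"}}).static_order())
except CycleError:
    print("cycle detected: not a valid workflow")
```

An orchestration service can follow the same idea incrementally with `get_ready()`/`done()` to run independent activities in parallel.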
Therefore, the most suitable approach is to delegate the execution of workflows to a dedicated off-chain resource: **the Orchestration Service**. This specific resource takes on the orchestration role while the protocol maintains its position as the source of truth and ultimate authority to which it must adhere. diff --git a/docs/academy/decentralized-identity.mdx b/docs/academy/decentralized-identity.mdx index ab40fdccf2a..79555116f5d 100644 --- a/docs/academy/decentralized-identity.mdx +++ b/docs/academy/decentralized-identity.mdx @@ -175,7 +175,7 @@ You can use this tool to resolve a DID : https://resolver.identity.foundation/ ### A verification method: the did:key Method The **did:key method** is a specific way to create and use DIDs that is focused on simplicity and universality. Here's an overview of the main features: -Direct Incorporation of Public Key: In the did:key method, the DID directly encodes the public key itself. This means the DID is self-describing and doesn’t require an external resolution to a DID document. +- Direct Incorporation of Public Key: In the did:key method, the DID directly encodes the public key itself. This means the DID is self-describing and doesn't require external resolution to obtain a DID document. - Simplicity: It is one of the simplest forms of DID, as it doesn't rely on a blockchain or a distributed ledger. The did:key method generates DIDs that are entirely independent of any registry, network, or company. - Instantaneous Resolution: Because the public key information is embedded in the DID, resolving a did:key DID to its DID document is a straightforward, computation-only process. There is no need to interact with a ledger or network to retrieve the DID document. @@ -184,7 +184,7 @@ Direct Incorporation of Public Key: In the did:key method, the DID directly enco A did:key DID looks like this: did:key:z12ab34cd56ef78gh90i...
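A toy sketch can illustrate the self-describing property. Real did:key identifiers use a multicodec prefix and base58btc multibase encoding (hence the leading 'z'); the base64url encoding below is only a stand-in available in the Python standard library, so the resulting identifiers are illustrative, not spec-compliant:

```python
import base64

def toy_did_key(public_key: bytes) -> str:
    # Toy encoding only: real did:key prepends a multicodec key-type prefix
    # and uses base58btc multibase ('z...'), not base64url.
    encoded = base64.urlsafe_b64encode(public_key).rstrip(b"=").decode()
    return f"did:key:{encoded}"

def toy_resolve(did: str) -> bytes:
    # Resolution is pure computation: the key is recovered from the DID
    # itself, with no ledger or network lookup.
    encoded = did.removeprefix("did:key:")
    padding = "=" * (-len(encoded) % 4)
    return base64.urlsafe_b64decode(encoded + padding)

key = bytes(range(32))  # stand-in for an Ed25519 public key
did = toy_did_key(key)

assert toy_resolve(did) == key           # self-describing: no registry needed
assert toy_did_key(b"other key") != did  # rotating the key means a new DID
```

The last assertion is the flip side of the design: because the key is the identifier, there is no way to rotate the key while keeping the same DID, which is the "No Dynamic Updates" limitation listed below.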
The limitations of the did:key method are: -- No Dynamic Updates: Since did:key DIDs are static and don’t reference an external source for their DID document, they cannot be updated. Any change in the key or other details requires the generation of a new DID. +- No Dynamic Updates: Since did:key DIDs are static and don't reference an external source for their DID document, they cannot be updated. Any change in the key or other details requires the generation of a new DID. - No Privacy Layer: The method exposes the public key openly, which might not be desirable in all scenarios, especially where privacy is a concern. # Decentralized identity in OKP4 Protocol diff --git a/docs/academy/describe-resource.mdx b/docs/academy/describe-resource.mdx index 6787f0671be..c08d6bb037c 100644 --- a/docs/academy/describe-resource.mdx +++ b/docs/academy/describe-resource.mdx @@ -33,7 +33,7 @@ The OKP4 Protocol can provide you keys (also called a wallet) and DID. A key pai When creating a wallet, you're typically provided with a mnemonic consisting of 12, 24, or sometimes more words. This mnemonic acts as a human-readable representation of the underlying cryptographic information. It is easier to remember and write down than the complex numbers and characters representing the private key. The mnemonic serves as a backup mechanism for the wallet. You can regenerate the key pair by inputting the mnemonic into any compatible wallet software to recover access to your funds. -Here, we don’t create a wallet for cryptocurrency storage but for secure storage of identifiers: A wallet can securely store digital identifiers, such as proofs of identity, certificates, or other personal identification information. +Here, we don't create a wallet for cryptocurrency storage but for secure storage of identifiers: A wallet can securely store digital identifiers, such as proofs of identity, certificates, or other personal identification information. 
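The recovery property described above (the same mnemonic always regenerates the same key material) comes from BIP-39, the standard that most Cosmos-based wallets rely on. The seed derivation itself fits in a few lines of standard-library Python; note that a real mnemonic is checksummed against a fixed wordlist, and turning the seed into an actual key pair additionally involves BIP-32/BIP-44 path derivation, both omitted here:

```python
import hashlib

def bip39_seed(mnemonic: str, passphrase: str = "") -> bytes:
    # BIP-39: the seed is PBKDF2-HMAC-SHA512 over the mnemonic, salted with
    # "mnemonic" + passphrase, 2048 iterations, producing 64 bytes.
    return hashlib.pbkdf2_hmac(
        "sha512",
        mnemonic.encode("utf-8"),
        ("mnemonic" + passphrase).encode("utf-8"),
        2048,
        dklen=64,
    )

words = "abandon abandon ability"  # illustrative only, not a valid checksummed mnemonic
seed_a = bip39_seed(words)
seed_b = bip39_seed(words)

assert seed_a == seed_b   # same words, same seed: the mnemonic is a full backup
assert len(seed_a) == 64
```

This determinism is why writing the words down is equivalent to backing up the wallet, and why anyone who reads them can reconstruct the keys.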
:::danger A keyring is a secure software utility designed to store and manage credentials, such as passwords, cryptographic keys, and API tokens, in a centralized and encrypted form. `--keyring-backend test` is a command parameter used to configure a keyring in test mode, which is helpful for development and testing but not for applications where private key security is a major concern. Be careful; this keyring is unsafe as the private keys are unencrypted on the file system. @@ -104,7 +104,7 @@ For the dataset, instantiate the template [credential-dataset-description](https For the service, instantiate the template [credential-digital-service-description](https://github.com/okp4/ontology/blob/main/src/example/digital-service/ipfs-digital-service-description.jsonld). :::info -Would you like to describe your dataset with other properties? You can create your template of credentials. Don’t directly add these properties in the DatasetDescriptionCredential as defined in the Ontology. +Would you like to describe your dataset with other properties? You can create your template of credentials. Don't directly add these properties in the DatasetDescriptionCredential as defined in the Ontology. ::: ### Description of the dataset @@ -121,7 +121,7 @@ Here are the following metadata of the dataset [Crime Data from 2020 to Present] | Temporal coverage | 2020 to Present | | Topic | Security | -Let’s fill in the template. +Let's fill in the template. ```json { @@ -223,7 +223,7 @@ Fields to modify: - `hasPublisher': fill in the name of the entity primarily responsible for making the Digital Service available - `hasTag`: fill in a list of tags - `hasTitle`: fill in the title of the service -- `hasWebpage`: fill in the URL of the service’s webpage +- `hasWebpage`: fill in the URL of the service's webpage - `id` (issuer): copy the did:key of the issuer - `name`: put the issuer's name (string). @@ -249,7 +249,7 @@ The OKP4 blockchain can only register VCs in N-Quads format. 
Then, you must conv ## Step 4: Register the credentials in the blockchain -The final step is to register the VCs in the OKP4 blockchain by submitting them to the Dataverse smart contract. It’s the role of the Registrant (who can be the Holder or another entity). +The final step is to register the VCs in the OKP4 blockchain by submitting them to the Dataverse smart contract. It's the role of the Registrant (who can be the Holder or another entity). :::info Note that as you interact with the OKP4 blockchain, you must pay fees in $KNOW at each transaction. @@ -270,5 +270,5 @@ The Protocol will check the signature and if the public key corresponds to the p The command returns the hash of the transaction. You can find more details of this transaction in the [Explorer](https://explore.okp4.network/). Select the network (Currently Drunemeton-Testnet), click on the Search icon, and paste the transaction hash. :::warning -Remember, the Resource Governance VC ensures the existence of a resource in the protocol: a resource or zone exists in the protocol only if governance is attached to it. Let’s do it on the next page. -::: \ No newline at end of file +Remember, the Resource Governance VC ensures the existence of a resource in the protocol: a resource or zone exists in the protocol only if governance is attached to it. Let's do it on the next page. +::: diff --git a/docs/academy/describe-zone.mdx b/docs/academy/describe-zone.mdx index ef5a790ffcb..c6407980c92 100644 --- a/docs/academy/describe-zone.mdx +++ b/docs/academy/describe-zone.mdx @@ -10,7 +10,7 @@ A [zone](https://docs.okp4.network/whitepaper/solution#zone-overview) is a conce All the workflows initiated within the OKP4 protocol must be defined in the context of a specific Zone. When the workflow is submitted, the protocol will check if all rules of the zone and of the engaged resources are respected and will apply the business model of the zone. -Let’s see how to describe and register a zone in the OKP4 Protocol. 
Here are the four steps involved: +Let's see how to describe and register a zone in the OKP4 Protocol. Here are the four steps involved:
Steps to describe a resource in the Dataverse @@ -30,7 +30,7 @@ The OKP4 Protocol can provide you keys (also called a wallet) and DID. A key pai When creating a wallet, you're typically provided with a mnemonic consisting of 12, 24, or sometimes more words. This mnemonic acts as a human-readable representation of the underlying cryptographic information. It is easier to remember and write down than the complex numbers and characters representing the private key. The mnemonic serves as a backup mechanism for the wallet. You can regenerate the key pair by inputting the mnemonic into any compatible wallet software to recover access to your funds. -Here, we don’t create a wallet for cryptocurrency storage but for secure storage of identifiers: A wallet can securely store digital identifiers, such as proofs of identity, certificates, or other personal identification information. +Here, we don't create a wallet for cryptocurrency storage but for secure storage of identifiers: A wallet can securely store digital identifiers, such as proofs of identity, certificates, or other personal identification information. :::danger A keyring is a secure software utility designed to store and manage credentials, such as passwords, cryptographic keys, and API tokens, in a centralized and encrypted form. `--keyring-backend test` is a command used to configure a keyring in test mode, which is helpful for development and testing but not for applications where private key security is a major concern. Be careful; this keyring is unsafe as the private keys are unencrypted on the file system. @@ -98,7 +98,7 @@ Find the credential templates you need in the [Ontology documentation](https://d For the zone, instantiate the template [credential-zone-description](https://github.com/okp4/ontology/blob/main/src/example/zone/collab-ai-zone-description.jsonld). :::info -Would you like to describe your zone with other properties? You can create your template of credentials. 
Don’t directly add these properties in the ZoneDescriptionCredential as defined in the Ontology. +Would you like to describe your zone with other properties? You can create your template of credentials. Don't directly add these properties in the ZoneDescriptionCredential as defined in the Ontology. ::: ### Description of the Zone @@ -112,7 +112,7 @@ Here are the following metadata of the zone Collaborative AI Zone | Tags | AI, Collaboration, Machine learning | | Topic | Other | -Let’s fill in the template. +Let's fill in the template. ```json { @@ -178,7 +178,7 @@ The OKP4 blockchain can only register VCs in N-Quads format. Then, you must conv ## Step 4: Register the credentials in the blockchain -The final step is to register the VCs in the OKP4 blockchain by submitting them to the Dataverse smart contract. It’s the role of the Registrant (who can be the Holder or another entity). +The final step is to register the VCs in the OKP4 blockchain by submitting them to the Dataverse smart contract. It's the role of the Registrant (who can be the Holder or another entity). :::info Note that as you interact with the OKP4 blockchain, you must pay fees in $KNOW at each transaction. @@ -199,5 +199,5 @@ The Protocol will check the signature and if the public key corresponds to the p The command returns the hash of the transaction. You can find more details of this transaction in the [Explorer](https://explore.okp4.network/). Select the network (Currently Drunemeton-Testnet), click on the Search icon, and paste the transaction hash. :::warning -Remember, the Resource Governance VC ensures the existence of a resource in the protocol: a resource or zone exists in the protocol only if governance is attached to it. Let’s do it on the next page. -::: \ No newline at end of file +Remember, the Resource Governance VC ensures the existence of a resource in the protocol: a resource or zone exists in the protocol only if governance is attached to it. Let's do it on the next page. 
+::: diff --git a/docs/academy/resource-governance.mdx b/docs/academy/resource-governance.mdx index 731de765dce..0bbf54482ba 100644 --- a/docs/academy/resource-governance.mdx +++ b/docs/academy/resource-governance.mdx @@ -306,7 +306,7 @@ The VC is now in the hands of the Holder. Note that it is possible that the Issu The OKP4 blockchain can only register VCs in N-Quads format. Then, you must convert the jsonld files in N-Quads. You can use this tool: https://transform.tools/jsonld-to-nquads . -The final step is to register the VCs in the OKP4 blockchain by submitting them to the Dataverse smart contract. It’s the role of the Registrant (who can be the Holder or another entity). +The final step is to register the VCs in the OKP4 blockchain by submitting them to the Dataverse smart contract. It's the role of the Registrant (who can be the Holder or another entity). :::info Note that as you interact with the OKP4 blockchain, you must pay fees in $KNOW at each transaction. diff --git a/docs/academy/rules.mdx b/docs/academy/rules.mdx index 2adfd1c3c7f..59597da446b 100644 --- a/docs/academy/rules.mdx +++ b/docs/academy/rules.mdx @@ -45,7 +45,7 @@ The OKP4 protocol acknowledges the dynamic nature of resource management. Rules OKP4 rules are not isolated entities; they can interact with one another. Understanding and defining inter-rule relationships is a powerful aspect of OKP4's flexibility. For example, you can establish dependencies between rules, ensuring that certain conditions must be met before others come into effect. -The ability to define rules is a pivotal feature of the OKP4 protocol, offering an unprecedented level of control and customization. By leveraging these rules, participants can create diverse applications and ecosystems, each with unique governance and operational dynamics. This flexibility is central to OKP4’s vision of fostering a decentralized, collaborative, and innovative digital resource environment. 
+The ability to define rules is a pivotal feature of the OKP4 protocol, offering an unprecedented level of control and customization. By leveraging these rules, participants can create diverse applications and ecosystems, each with unique governance and operational dynamics. This flexibility is central to OKP4's vision of fostering a decentralized, collaborative, and innovative digital resource environment. ## Prolog: Turing complete logical and declarative programming language @@ -196,7 +196,7 @@ Given the interconnected nature of resources and their rules, OKP4 acknowledges OKP4 places a premium on transparency in resource governance. Resource consents are designed to be transparent and auditable, allowing users and stakeholders to understand how each resource is governed. This transparency fosters trust and contributes to the overall security and accountability within the OKP4 ecosystem. -**7. Let’s practice** +**7. Let's practice** Let's consider a simple example of rules written in Prolog for a hypothetical resource-sharing scenario within the OKP4 context. In this example, we'll create rules for granting access to a specific dataset based on user roles and temporal constraints. diff --git a/docs/academy/who-is-it-for.mdx b/docs/academy/who-is-it-for.mdx index 60c883b04b5..8f3575171a5 100644 --- a/docs/academy/who-is-it-for.mdx +++ b/docs/academy/who-is-it-for.mdx @@ -1,127 +1,127 @@ ---- -sidebar_position: 6 ---- - -import Quiz from 'react-quiz-component'; -import * as quiz from './who-is-it-for-quiz.json'; - -# Who is it for? - -Reading time: {readingTime} min - -OKP4 is often described as a protocol that enables sharing Anything as a Service under any Conditions. To illustrate the infinite possibilities opened up by the protocol, this section provides further examples of the intended audience for the protocol. - -## For Data Providers - -Data Providers reference datasets within the protocol. 
They are responsible for describing dataset characteristics (metadata) and establishing access conditions for the use of this resource. They interact with the protocol via dedicated smart contracts to register datasets, define their access conditions and metadata, and make the dataset available and accessible. - -With the implementation of dataset access conditions, Data Providers gain the ability to define how they wish their resources to be utilized precisely. This includes specifying whether a business model is associated with the dataset, determining the privacy status of the data, and elucidating any other rights or licenses linked to the dataset. These rules empower Data Providers to intricately shape the terms under which their valuable resources are accessed, fostering transparency and control over their datasets' usage and potential monetization. - -### Data Providers, who are they? - -Data Scientists: As innovation engines, Data Scientists play a crucial role in providing enriched and qualified data. Their expertise opens up diverse opportunities across various sectors such as healthcare, agriculture, commerce, transportation, industry, etc. A concrete example could be a Data Scientist in the healthcare domain sharing anonymized datasets related to medical diagnostics. OKP4 represents a new frontier, ensuring sovereignty over datasets. This creates a significant opportunity for collaboration and value creation while preserving the intellectual property of data, providing fair compensation and traceability of usage. - -Individuals: Individuals are becoming active contributors by sharing any kind of datasets. It can be data they collected and curated or even personal data, including shopping habits, internet navigation, location, and health data. For instance, individuals might voluntarily share geolocation data to contribute to environmental analyses. 
OKP4 provides an infrastructure for sharing and valuing this data while ensuring that access conditions established by the provider are respected (compensation, confidentiality, protection of individual rights). Unlike Data Scientists, individuals may require intuitive no-code interfaces, and OKP4, along with projects building applications on the protocol, will provide the necessary tools to make the user experience accessible. - -Sensors: Acting as permanent sources of streaming data inflows, sensors provide real-time information in various domains such as environmental monitoring, logistics, and connected health, enriching predictive analytics. As a decentralized identity (DID) within the protocol, Sensors can provide data according to defined rules. - -Protocols, dApps, and nodes: Decentralized protocols and dApps generate massive amounts of raw data. An example could be a DeFi protocol sharing raw data for in-depth analyses of decentralized financial activities. OKP4 enables these actors to make their data more easily accessible and exploitable, facilitating in-depth analyses of user behavior, smart contracts, or others. - -NGO: Non-profit organizations can share data on social, environmental, and health topics, catalyzing research and social action initiatives. Making such data available for a wide range of use cases. - -Companies: Motivated by collaborations or valorizing their data, companies find a flexible framework in OKP4. They can compose designs that meet their needs for permissions and robust data processing, especially for sensitive data. - -While not exhaustive, this classification highlights the diversity of actors participating in data sharing. Generating future knowledge requires various skills and multidisciplinary analyses, emphasizing the importance of diverse data sources. Each professional, individual, or organizational category contributes to building a robust, ethical, and innovation-friendly decentralized data ecosystem globally. 
- -## For Service Providers - -Service Providers represent another essential component within the OKP4 ecosystem, crucial in providing and managing digital services. They offer services such as algorithms, software, storage systems, or any other digital item requiring processing time. - -Depending on access conditions and Zone rules, Service Providers may be required to lock tokens on services to ensure service availability and integrity. This stake incentivizes Service Providers to guarantee off-chain service execution when invoked and can be slashed if the service level is not respected. - -Service Providers interact with the protocol by referencing their services and access conditions via smart contracts, ensuring the availability and functioning of their service by on-chain registered specifications. - -### Service providers: Who are they? - -AI Builders: AI builders, encompassing both algorithm and AI developers, play an indispensable role within the OKP4 ecosystem, contributing diverse machine learning solutions. These builders can provide a multitude of models, including foundation models, generative models, or more specialized ones such as text-to-speech and voice recognition. The degree of pre-training in these models can be adjusted based on consumers' needs, making them adaptable for various purposes, including additional training, fine-tuning, and inference. - -Software, BI and data-driven devs: These providers are crucial in delivering data processing services. These providers enable more efficient and precise data processing by offering comprehensive solutions for handling, analyzing, and transforming data, extracting valuable insights and knowledge. -This diverse array of services encompasses real-time data processing solutions, indexing services, data cleaning services, and data integration services, to name a few. 
These services contribute not only to the effective management of data but also to the extraction of meaningful information and the creation of value. - -Infrastructure Providers: Infrastructure providers operate in the domains of computation, storage, and service orchestration, to name a few, supplying essential technological building blocks crucial for workflow realization. As an agnostic and interoperable protocol, OKP4 facilitates the connection to any infrastructure solution, be it cloud storage options, self-hosted cloud storage, centralized computation, proprietary computational resources, or decentralized options. This flexibility applies to both individuals and major cloud service providers. - -Third-party identity service: This is a critical infrastructure component for OKP4's proper functioning, assigning a DID (Decentralized Identity) to each resource within the OKP4 network. This approach enables composability with most identification infrastructures, notably adhering to W3C standards. This feature ensures efficient identity management, enhancing security and traceability within the OKP4 network. - -Other Protocols: Any off-chain resources provided by a decentralized blockchain or app can be made available through OKP4 as long as the adapted connector exists. Synergies can go beyond referencing off-chain datasets or services, though. As mentioned in the "Protocol Concept - Interoperability" section, by leveraging the cosmos stack, OKP4 can seamlessly integrate other blockchains, such as Akash or Jackal, with maximum security and minimized trust through IBC by leveraging the Cosmos stack. By multiplying decentralized service offerings, OKP4 empowers users to compose their workflows with granularity based on their specific needs. - -These are just a few examples of essential service providers crucial to properly functioning the protocol, aiming to broaden your understanding of the spectrum of possibilities. 
Nevertheless, it's necessary to grasp that with the impressive development of AI models, some of the technical components used today will be entirely disrupted by what will be developed tomorrow. The Open Knowledge Protocol is designed to guide providers through this transition, opening up new opportunities in the knowledge economy. - -## For Consumers - -Consumers initiate workflows on shared resources from providers to access or generate knowledge. They can consume simple workflows, like downloading a dataset, or more intricate processes involving interactions with tens or hundreds of datasets and services. - -Consumers interact with the protocol by initiating on-chain transactions invoking smart contracts to request workflow execution. They must pay the fees and rewards required to start the workflow execution. - -### Consumers: Who are they? -Similar to providers, consumers encompass a variety of entities, such as individuals, companies, AI agents, NGOs, etc... - -One of the primary benefits for consumers lies in gaining access to a broader array of resources. As providers can set the conditions for resource access, they are incentivized to showcase their resources, presenting a more diversified range of options and solutions for consumers. Consumers leverage the output of workflows in various ways depending on their objectives. - -Here are some examples to help you understand who they are and why they use OKP4: - -### Companies - -Generally, companies will leverage OKP4 either to create and feed an application they will make available to their clients or to harness the generated knowledge for their own use. - -**A company aiming to harness Dataverse resources to craft its own XaaS:** - -Let’s take a financial analysis company that seeks to create its application by leveraging the Dataverse’s resources, creating an advanced financial service accessible through a front-end. 
Workflows can aggregate data from diverse providers, process them according to specific needs, and deliver a personalized XaaS service. A few examples of potential applications: -- Data aggregation Application -- Business Intelligence (BI) application -- In-depth financial analysis application -- Predictive modeling services -- Monitoring and alerting Tool - -Another example is a company that aims to create a mobile health application tailored for individuals. By thoughtfully orchestrating relevant resources and services from Dataverse, the company can configure the application to execute workflows prioritizing data privacy. It ranges from a large broad of applications: -- Personal Health Monitoring -- Medical Appointment Management -- Wellness Data Analysis -- Personalized Health Advice -- Secure Information Sharing with Healthcare Professionals - -**A company aiming to harness Dataverse resources to feed its own infrastructure or knowledge base:** - -Most technology companies seek to modernize their IT infrastructure, transitioning from a monolithic architecture to a more modular, service-oriented approach. Leveraging OKP4 allows for gradually decomposing existing features into microservices, ensuring confident utilization of shared resources. This optimization enhances operational efficiency, simplifies maintenance, and enables swift adaptation to market changes within a trust-minimized environment using OKP4. - -Another example is an agricultural company aiming to expand its knowledge base to optimize farming practices, improve crop yields, and stay abreast of agricultural innovations. By harnessing Dataverse resources, this company can employ tailored workflows to gather diverse datasets related to soil composition, weather patterns, crop diseases, pest management strategies, and agricultural market trends. These datasets can be processed, analyzed, and integrated into the company's knowledge base. 
For instance, the company might use OKP4 to aggregate soil quality data from various sources, analyze historical weather patterns to predict future climate trends and identify effective pest management techniques based on data-driven insights. Additionally, the company can integrate market trend data to make informed decisions about crop selection, pricing strategies, and market demand. - - -### AI Agents - -AI agents, sophisticated software entities powered with artificial intelligence, autonomously execute tasks or services, driven by knowledge of user goals. They constantly learn through machine learning and personalize interactions, adapting behaviors based on the context. - -AI agents don't just interpret data, they actively contribute to complex decision-making. As a decentralized coordinator, OKP4 sees AI agents not just as digital services but as full-fledged actors. This perspective acknowledges that AI agents go beyond being passive services and instead play an active role in influencing and participating in decision-making processes. - -Here are a few examples illustrating the versatility and practical applications of OKP4 for AI agent as a consumer: - -**AI Agent Enhancing Database through Dataset Scrutiny:** An AI agent, seamlessly integrated into a market analysis platform, utilizes workflows within Dataverse to extract, analyze, and enrich a database. For instance, it can extract pertinent market trends from diverse datasets, significantly enhancing the quality and depth of available analyses. - -**Security Monitoring with On-Chain Data:** An AI agent specialized in security monitoring engages with on-chain data to analyze a dApp or a blockchain. Within a security framework, this AI agent leverages workflows to scrutinize on-chain transactions and events, identifying potential security threats and vulnerabilities. 
- -**Real-time Data Interpretation with Streaming Data Analysis:** As a consumer on OKP4, the AI agent harnesses data from diverse streaming sources, enabling real-time interpretation with accurate and up-to-date information. This ensures effective, traceable, and trust-minimized monitoring. - -**AI Agent for Forecasting Across Domains:** In various domains, such as weather, finance, and supply-chain prediction, the AI agent leverages OKP4 to access a variety of datasets. The agent efficiently performs forecasting tasks using the protocol, delivering valuable insights and predictions. This adaptability ensures efficiency in forecasting across different fields. - -### Other consumers and sum-up - -Although we have explored only two types of entities, this framework seamlessly extends to various entities. While the core function of workflows remains consistent, the diversity of objectives becomes apparent. Whether profit-driven companies, NGOs striving for social impact, or individuals contributing voluntarily, the protocol's adaptability fosters collaboration across a spectrum of goals. OKP4 functions as an orchestration layer where the same fundamental work is tailored to meet varied objectives, from financial gain to free and open contributions, all within this dynamic digital collaboration space. - -## Same entities but different roles - -In exploring the various actors within the protocol, we observe that most are both providers and consumers. - -The nuanced nature of OKP4 blurs the traditional boundaries between these roles. Entities seamlessly transition between these roles based on their context and specific objectives. As a protocol, OKP4 directly addresses builders and exploiters of digital resources, fostering a collaborative environment where each participant, specialized and competent in their domain, adds value to creating new knowledge. 
- -It is not uncommon to witness entities functioning as providers in specific scenarios while assuming the role of consumers in others. This fluidity underscores the adaptability and versatility inherent in OKP4. As we navigate this decentralized digital landscape, OKP4 emerges as a facilitator, providing a framework for a myriad of actors to engage, collaborate, and collectively contribute to generating innovative knowledge. - - +--- +sidebar_position: 6 +--- + +import Quiz from 'react-quiz-component'; +import * as quiz from './who-is-it-for-quiz.json'; + +# Who is it for? + +Reading time: {readingTime} min + +OKP4 is often described as a protocol that enables sharing Anything as a Service under any Conditions. To illustrate the infinite possibilities opened up by the protocol, this section provides further examples of the intended audience for the protocol. + +## For Data Providers + +Data Providers reference datasets within the protocol. They are responsible for describing dataset characteristics (metadata) and establishing access conditions for the use of this resource. They interact with the protocol via dedicated smart contracts to register datasets, define their access conditions and metadata, and make the dataset available and accessible. + +With the implementation of dataset access conditions, Data Providers gain the ability to define how they wish their resources to be utilized precisely. This includes specifying whether a business model is associated with the dataset, determining the privacy status of the data, and elucidating any other rights or licenses linked to the dataset. These rules empower Data Providers to intricately shape the terms under which their valuable resources are accessed, fostering transparency and control over their datasets' usage and potential monetization. + +### Data Providers, who are they? + +Data Scientists: As innovation engines, Data Scientists play a crucial role in providing enriched and qualified data. 
Their expertise opens up diverse opportunities across various sectors such as healthcare, agriculture, commerce, transportation, industry, etc. A concrete example could be a Data Scientist in the healthcare domain sharing anonymized datasets related to medical diagnostics. OKP4 represents a new frontier, ensuring sovereignty over datasets. This creates a significant opportunity for collaboration and value creation while preserving the intellectual property of data, providing fair compensation and traceability of usage. + +Individuals: Individuals are becoming active contributors by sharing any kind of dataset. It can be data they collected and curated or even personal data, including shopping habits, internet navigation, location, and health data. For instance, individuals might voluntarily share geolocation data to contribute to environmental analyses. OKP4 provides an infrastructure for sharing and valuing this data while ensuring that access conditions established by the provider are respected (compensation, confidentiality, protection of individual rights). Unlike Data Scientists, individuals may require intuitive no-code interfaces, and OKP4, along with projects building applications on the protocol, will provide the necessary tools to make the user experience accessible. + +Sensors: Acting as permanent sources of streaming data inflows, sensors provide real-time information in various domains such as environmental monitoring, logistics, and connected health, enriching predictive analytics. With a decentralized identifier (DID) within the protocol, sensors can provide data according to defined rules. + +Protocols, dApps, and nodes: Decentralized protocols and dApps generate massive amounts of raw data. An example could be a DeFi protocol sharing raw data for in-depth analyses of decentralized financial activities.
OKP4 enables these actors to make their data more easily accessible and exploitable, facilitating in-depth analyses of user behavior, smart contracts, and more. + +NGOs: Non-profit organizations can share data on social, environmental, and health topics, catalyzing research and social action initiatives and making such data available for a wide range of use cases. + +Companies: Motivated by collaborations or by valorizing their data, companies find a flexible framework in OKP4. They can compose designs that meet their needs for permissions and robust data processing, especially for sensitive data. + +While not exhaustive, this classification highlights the diversity of actors participating in data sharing. Generating future knowledge requires various skills and multidisciplinary analyses, emphasizing the importance of diverse data sources. Each professional, individual, or organizational category contributes to building a robust, ethical, and innovation-friendly decentralized data ecosystem globally. + +## For Service Providers + +Service Providers represent another essential component within the OKP4 ecosystem, playing a crucial role in providing and managing digital services. They offer services such as algorithms, software, storage systems, or any other digital item requiring processing time. + +Depending on access conditions and Zone rules, Service Providers may be required to lock tokens on services to ensure service availability and integrity. This stake incentivizes Service Providers to guarantee off-chain service execution when invoked and can be slashed if the service level is not respected. + +Service Providers interact with the protocol by referencing their services and access conditions via smart contracts, ensuring the availability and functioning of their service in accordance with on-chain registered specifications. + +### Service Providers: Who are they?
+ +AI Builders: AI builders, encompassing both algorithm and AI developers, play an indispensable role within the OKP4 ecosystem, contributing diverse machine learning solutions. These builders can provide a multitude of models, including foundation models, generative models, or more specialized ones such as text-to-speech and voice recognition. The degree of pre-training in these models can be adjusted based on consumers' needs, making them adaptable for various purposes, including additional training, fine-tuning, and inference. + +Software, BI, and data-driven devs: These providers play a crucial role in delivering data processing services, enabling more efficient and precise data processing by offering comprehensive solutions for handling, analyzing, and transforming data and extracting valuable insights and knowledge. +This diverse array of services encompasses real-time data processing solutions, indexing services, data cleaning services, and data integration services, to name a few. These services contribute not only to the effective management of data but also to the extraction of meaningful information and the creation of value. + +Infrastructure Providers: Infrastructure providers operate in the domains of computation, storage, and service orchestration, among others, supplying essential technological building blocks crucial for workflow realization. As an agnostic and interoperable protocol, OKP4 facilitates the connection to any infrastructure solution, be it cloud storage options, self-hosted cloud storage, centralized computation, proprietary computational resources, or decentralized options. This flexibility applies to both individuals and major cloud service providers. + +Third-party identity service: This is a critical infrastructure component for OKP4's proper functioning, assigning a DID (Decentralized Identifier) to each resource within the OKP4 network.
This approach enables composability with most identification infrastructures, notably adhering to W3C standards. This feature ensures efficient identity management, enhancing security and traceability within the OKP4 network. + +Other Protocols: Any off-chain resources provided by a decentralized blockchain or app can be made available through OKP4 as long as a suitable connector exists. Synergies can go beyond referencing off-chain datasets or services, though. As mentioned in the "Protocol Concept - Interoperability" section, by leveraging the Cosmos stack, OKP4 can seamlessly integrate other blockchains, such as Akash or Jackal, with maximum security and minimized trust through IBC. By multiplying decentralized service offerings, OKP4 empowers users to compose their workflows with granularity based on their specific needs. + +These are just a few examples of essential service providers crucial to the proper functioning of the protocol, aiming to broaden your understanding of the spectrum of possibilities. Nevertheless, it's necessary to grasp that with the impressive development of AI models, some of the technical components used today will be entirely disrupted by what will be developed tomorrow. The Open Knowledge Protocol is designed to guide providers through this transition, opening up new opportunities in the knowledge economy. + +## For Consumers + +Consumers initiate workflows on shared resources from providers to access or generate knowledge. They can consume simple workflows, like downloading a dataset, or more intricate processes involving interactions with tens or hundreds of datasets and services. + +Consumers interact with the protocol by initiating on-chain transactions invoking smart contracts to request workflow execution. They must pay the fees and rewards required to start it. + +### Consumers: Who are they?
+Similar to providers, consumers encompass a variety of entities, such as individuals, companies, AI agents, NGOs, etc. + +One of the primary benefits for consumers lies in gaining access to a broader array of resources. As providers can set the conditions for resource access, they are incentivized to showcase their resources, presenting a more diversified range of options and solutions for consumers. Consumers leverage the output of workflows in various ways depending on their objectives. + +Here are some examples to help you understand who they are and why they use OKP4: + +### Companies + +Generally, companies will leverage OKP4 either to create and feed an application they will make available to their clients or to harness the generated knowledge for their own use. + +**A company aiming to harness Dataverse resources to craft its own XaaS:** + +Let's take a financial analysis company that seeks to create its application by leveraging the Dataverse's resources, creating an advanced financial service accessible through a front-end. Workflows can aggregate data from diverse providers, process them according to specific needs, and deliver a personalized XaaS service. A few examples of potential applications: +- Data aggregation application +- Business intelligence (BI) application +- In-depth financial analysis application +- Predictive modeling services +- Monitoring and alerting tools + +Another example is a company that aims to create a mobile health application tailored for individuals. By thoughtfully orchestrating relevant resources and services from Dataverse, the company can configure the application to execute workflows prioritizing data privacy.
It spans a broad range of applications: +- Personal Health Monitoring +- Medical Appointment Management +- Wellness Data Analysis +- Personalized Health Advice +- Secure Information Sharing with Healthcare Professionals + +**A company aiming to harness Dataverse resources to feed its own infrastructure or knowledge base:** + +Most technology companies seek to modernize their IT infrastructure, transitioning from a monolithic architecture to a more modular, service-oriented approach. Leveraging OKP4 allows for gradually decomposing existing features into microservices, ensuring confident utilization of shared resources. This optimization enhances operational efficiency, simplifies maintenance, and enables swift adaptation to market changes within a trust-minimized environment. + +Another example is an agricultural company aiming to expand its knowledge base to optimize farming practices, improve crop yields, and stay abreast of agricultural innovations. By harnessing Dataverse resources, this company can employ tailored workflows to gather diverse datasets related to soil composition, weather patterns, crop diseases, pest management strategies, and agricultural market trends. These datasets can be processed, analyzed, and integrated into the company's knowledge base. For instance, the company might use OKP4 to aggregate soil quality data from various sources, analyze historical weather patterns to predict future climate trends, and identify effective pest management techniques based on data-driven insights. Additionally, the company can integrate market trend data to make informed decisions about crop selection, pricing strategies, and market demand. + + +### AI Agents + +AI agents, sophisticated software entities powered by artificial intelligence, autonomously execute tasks or services, driven by knowledge of user goals. They constantly learn through machine learning and personalize interactions, adapting behaviors based on the context.
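The workflows running through these examples share one shape: activities whose outputs feed other activities, with the dependencies forming a directed acyclic graph (DAG). As a purely illustrative sketch (the activity names and return values are invented, and none of this is OKP4 code), a minimal dependency-ordered executor can be written as:

```python
# Purely illustrative, not OKP4 code: workflow activities form a DAG,
# so a run is just "execute each activity once its inputs exist".
from graphlib import TopologicalSorter  # stdlib since Python 3.9

# Hypothetical activities for the agricultural example above:
# each entry maps a name to (function, list of dependencies).
activities = {
    "aggregate_soil_data": (lambda inputs: "soil-table", []),
    "analyze_weather": (lambda inputs: "climate-trends", []),
    "integrate_knowledge": (
        lambda inputs: "kb(" + ", ".join(inputs) + ")",
        ["aggregate_soil_data", "analyze_weather"],
    ),
}

# Run activities in dependency order, feeding each one its inputs.
results = {}
order = TopologicalSorter({n: set(d) for n, (_, d) in activities.items()})
for name in order.static_order():
    fn, deps = activities[name]
    results[name] = fn([results[d] for d in deps])

print(results["integrate_knowledge"])  # kb(soil-table, climate-trends)
```

In a real deployment each activity would resolve to an off-chain service invocation under the Zone's rules; here each one is just a local function.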
+ +AI agents don't just interpret data; they actively contribute to complex decision-making. As a decentralized coordinator, OKP4 sees AI agents not just as digital services but as full-fledged actors. This perspective acknowledges that AI agents go beyond being passive services and instead play an active role in influencing and participating in decision-making processes. + +Here are a few examples illustrating the versatility and practical applications of OKP4 for AI agents as consumers: + +**AI Agent Enhancing a Database through Dataset Scrutiny:** An AI agent, seamlessly integrated into a market analysis platform, utilizes workflows within Dataverse to extract, analyze, and enrich a database. For instance, it can extract pertinent market trends from diverse datasets, significantly enhancing the quality and depth of available analyses. + +**Security Monitoring with On-Chain Data:** An AI agent specialized in security monitoring engages with on-chain data to analyze a dApp or a blockchain. Within a security framework, this AI agent leverages workflows to scrutinize on-chain transactions and events, identifying potential security threats and vulnerabilities. + +**Real-time Data Interpretation with Streaming Data Analysis:** As a consumer on OKP4, the AI agent harnesses data from diverse streaming sources, enabling real-time interpretation with accurate and up-to-date information. This ensures effective, traceable, and trust-minimized monitoring. + +**AI Agent for Forecasting Across Domains:** In various domains, such as weather, finance, and supply-chain prediction, the AI agent leverages OKP4 to access a variety of datasets. The agent efficiently performs forecasting tasks using the protocol, delivering valuable insights and predictions. This adaptability ensures efficiency in forecasting across different fields. + +### Other consumers and summary + +Although we have explored only two types of entities, this framework seamlessly extends to many others.
While the core function of workflows remains consistent, the diversity of objectives becomes apparent. Whether profit-driven companies, NGOs striving for social impact, or individuals contributing voluntarily, the protocol's adaptability fosters collaboration across a spectrum of goals. OKP4 functions as an orchestration layer where the same fundamental work is tailored to meet varied objectives, from financial gain to free and open contributions, all within this dynamic digital collaboration space. + +## Same entities but different roles + +In exploring the various actors within the protocol, we observe that most are both providers and consumers. + +The nuanced nature of OKP4 blurs the traditional boundaries between these roles. Entities seamlessly transition between them based on their context and specific objectives. As a protocol, OKP4 directly addresses builders and users of digital resources, fostering a collaborative environment where each participant, specialized and competent in their domain, adds value to creating new knowledge. + +It is not uncommon to witness entities functioning as providers in specific scenarios while assuming the role of consumers in others. This fluidity underscores the adaptability and versatility inherent in OKP4. As we navigate this decentralized digital landscape, OKP4 emerges as a facilitator, providing a framework for a myriad of actors to engage, collaborate, and collectively contribute to generating innovative knowledge. + + diff --git a/docs/academy/zone-governance.mdx b/docs/academy/zone-governance.mdx index 22e74f7f6cd..a0f9839d005 100644 --- a/docs/academy/zone-governance.mdx +++ b/docs/academy/zone-governance.mdx @@ -285,7 +285,7 @@ The VC is now in the hands of the Holder. Note that it is possible that the Issu The OKP4 blockchain can only register VCs in N-Quads format. Then, you must convert the JSON-LD files to N-Quads. You can use this tool: https://transform.tools/jsonld-to-nquads .
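To make the target format concrete, the sketch below shows the N-Quads shape: one `subject predicate object graph .` statement per line. It is only an illustration (the identifiers are made up and literal escaping is ignored); for real conversions, use the tool linked above or an RDF library such as rdflib.

```python
# Toy illustration of the N-Quads shape only; real conversions should use
# the linked tool or an RDF library. Literal escaping and datatypes are ignored.

def to_nquad(subject: str, predicate: str, obj: str, graph: str) -> str:
    """Serialize one statement as an N-Quad line: IRIs (and DIDs) are wrapped
    in angle brackets, anything else becomes a plain string literal."""
    def term(t: str) -> str:
        is_iri = t.startswith(("http://", "https://", "did:"))
        return f"<{t}>" if is_iri else f'"{t}"'
    return f"{term(subject)} {term(predicate)} {term(obj)} {term(graph)} ."

# Hypothetical identifiers, for illustration only.
print(to_nquad(
    "did:key:z6MkexampleSubject",           # credential subject
    "https://schema.org/name",              # property IRI
    "My dataset",                           # literal value
    "https://example.org/graphs/metadata",  # named graph
))
# <did:key:z6MkexampleSubject> <https://schema.org/name> "My dataset" <https://example.org/graphs/metadata> .
```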
-The final step is to register the VCs in the OKP4 blockchain by submitting them to the Dataverse smart contract. It’s the role of the Registrant (who can be the Holder or another entity). +The final step is to register the VCs in the OKP4 blockchain by submitting them to the Dataverse smart contract. It's the role of the Registrant (who can be the Holder or another entity). :::info Note that as you interact with the OKP4 blockchain, you must pay fees in $KNOW at each transaction. diff --git a/docs/faq/faq.md b/docs/faq/faq.md index dfc9a01c2a1..85761abb2cb 100644 --- a/docs/faq/faq.md +++ b/docs/faq/faq.md @@ -22,15 +22,15 @@ OKP4 serves as a set of rules and conventions that facilitate interoperability a OKP4 does not aiming to be the next 'Ethereum Killer.' Instead, OKP4 plays a pivotal role in facilitating the transition to a new digital revolution, one centered around harnessing the power of data. In today's landscape, we recognize that trust and technical complexities often pose significant limitations. -OKP4 is a meticulously designed protocol that fuels the expansion of this digital revolution by simplifying the sharing of data and digital resources. It places a profound emphasis on respecting consent and the proper utilization of everyone’s assets, thereby fostering a more equitable and secure digital ecosystem +OKP4 is a meticulously designed protocol that fuels the expansion of this digital revolution by simplifying the sharing of data and digital resources. It places a profound emphasis on respecting consent and the proper utilization of everyone's assets, thereby fostering a more equitable and secure digital ecosystem ### What problem is OKP4 solving? -Today’s datasets are stored in silos, their potential and value stay untapped. This situation exists because of a substantial lack of trust and incentives to share data. +Today's datasets are stored in silos, their potential and value stay untapped. 
This situation exists because of a substantial lack of trust and incentives to share data. -Many companies’ products and crypto projects tried to tackle these issues through a similar approach: data marketplaces. But this approach is highly limiting because 1. the exchanged data is accessible to the buyer, resulting in risks, and 2. pricing mechanisms are disconnected from how data is used, resulting in poor incentives. +Many companies' products and crypto projects tried to tackle these issues through a similar approach: data marketplaces. But this approach is highly limiting because 1. the exchanged data is accessible to the buyer, resulting in risks, and 2. pricing mechanisms are disconnected from how data is used, resulting in poor incentives. -Today, there’s no infrastructure, centralized or decentralized, to easily share any resource and agree on rules to enable these resources to interact with each other. +Today, there's no infrastructure, centralized or decentralized, to easily share any resource and agree on rules to enable these resources to interact with each other. ### What solution is OKP4 providing? @@ -100,7 +100,7 @@ Resource Utilization: When a digital resource is referenced within the OKP4 prot ### Euh Wait… What is an ontology? -Don’t worry, an ontology, in the context of OKP4, refers to a structured representation of knowledge that defines the relationships between various terms or concepts. It's like a map that helps us understand how different referenced resources (datasets, algorithms, services…) are related to each other. +Don't worry, an ontology, in the context of OKP4, refers to a structured representation of knowledge that defines the relationships between various terms or concepts. It's like a map that helps us understand how different referenced resources (datasets, algorithms, services…) are related to each other. In OKP4, this ontology is used to describe the metadata or characteristics of resources and services. 
Think of it as the "data about the data." This structured information makes it easier for users and systems to understand and interact with resources and services within the OKP4 network. It ensures that everyone speaks the same language when it comes to sharing and accessing digital resources. @@ -126,7 +126,7 @@ Imagine an AI trained, owned and governed by a DAO with governance rules, like d ### What is the web2 alternative today? -The alternative to such a protocol is a set of trusted technical, legal, and financial intermediaries. That’s what we’ve seen being built in various use cases over the last few years, and it’s also very complex and creates a lot of friction. +The alternative to such a protocol is a set of trusted technical, legal, and financial intermediaries. That's what we've seen being built in various use cases over the last few years, and it's also very complex and creates a lot of friction. ### Who are OKP4 web3 competitors? @@ -141,21 +141,21 @@ If you want to delve into the Governance aspect, this series of 3 articles is a ### I'm still struggling to understand what makes OKP4 different. How is it different from Ocean Protocol, for example? -Ocean Protocol enables decentralized data exchange and monetization solutions through a marketplace. It’s a great solution if you want to sell datasets. +Ocean Protocol enables decentralized data exchange and monetization solutions through a marketplace. It's a great solution if you want to sell datasets. -OKP4 enables custom governance for complex workflows and applications powered by shared data & services. It’s a great solution if you want your data, algorithms or resources to contribute to any application on your own terms, with on-chain rules. +OKP4 enables custom governance for complex workflows and applications powered by shared data & services. It's a great solution if you want your data, algorithms or resources to contribute to any application on your own terms, with on-chain rules. 
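Coming back to the ontology answer above, the "map" of relationships can be pictured as a toy in-memory triple store. The vocabulary and resource names are invented for illustration and are not the actual OKP4 ontology:

```python
# Invented vocabulary, for illustration only; not the real OKP4 ontology.
triples = [
    ("dataset:weather-eu", "hasTitle", "European weather observations"),
    ("dataset:weather-eu", "hasFormat", "CSV"),
    ("dataset:weather-eu", "usableBy", "service:forecast-model"),
    ("service:forecast-model", "hasType", "prediction-algorithm"),
]

def describe(resource: str) -> dict:
    """Collect every property asserted about one resource: the
    'data about the data' for that resource."""
    return {p: o for (s, p, o) in triples if s == resource}

print(describe("service:forecast-model"))  # {'hasType': 'prediction-algorithm'}
```

Because providers and consumers read the same vocabulary, any participant can answer the same question about a resource in the same way, which is what "speaking the same language" buys.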
-While Ocean Protocol focuses on tokenizing datasets, we’re focused on building the right infrastructure and tools to enforce customized rules/permissions on any data or service. It goes way beyond what Ocean does as it has new primitives to build anything on top. OKP4 will leverage data from Ocean Protocol but will not be limited to that at all. +While Ocean Protocol focuses on tokenizing datasets, we're focused on building the right infrastructure and tools to enforce customized rules/permissions on any data or service. It goes way beyond what Ocean does as it has new primitives to build anything on top. OKP4 will leverage data from Ocean Protocol but will not be limited to that at all. ### Is OKP4 limited to web2 B2B applications? -Not at all! The protocol we’ve developed introduces innovative primitives suitable for a wide range of applications. Initially, we targeted B2B ones. While these have notable usage volume, they’re not seamless and pose bootstrapping challenges. +Not at all! The protocol we've developed introduces innovative primitives suitable for a wide range of applications. Initially, we targeted B2B ones. While these have notable usage volume, they're not seamless and pose bootstrapping challenges. -Consequently, we’ve prioritized no-code interfaces for data & service providers, Zone creators, governance participants, and workflow consumers. Our aim is to allow individuals to easily define rules and either contribute to or consume resources. These user-friendly interfaces empower developers to seamlessly experiment with off-chain coordination. While many community-driven innovations arise from our members (such as Marketplaces, Collaborative Research DAOs, Personal Data Vaults, etc.), our primary focus for adoption is collaborative AI training. The OKP4 Portal will feature what we term "AI Factory" templates. 
+Consequently, we've prioritized no-code interfaces for data & service providers, Zone creators, governance participants, and workflow consumers. Our aim is to allow individuals to easily define rules and either contribute to or consume resources. These user-friendly interfaces empower developers to seamlessly experiment with off-chain coordination. While many community-driven innovations arise from our members (such as Marketplaces, Collaborative Research DAOs, Personal Data Vaults, etc.), our primary focus for adoption is collaborative AI training. The OKP4 Portal will feature what we term "AI Factory" templates. These templates detail rules for data, algorithms, and infrastructure contributions. They can be customized to meet specific needs related to quality, privacy, transparency, or reliability. Furthermore, they introduce governance frameworks and custom tokens to manage or incentivize AI DAO participants. -The ultimate outcome is a collectively trained, owned, and governed AI, fueled by a network of incentivized contributors. At its core, OKP4 is more than a tool for businesses. It’s a protocol for collective innovation and coordination, inviting individuals from all backgrounds to contribute and add value. +The ultimate outcome is a collectively trained, owned, and governed AI, fueled by a network of incentivized contributors. At its core, OKP4 is more than a tool for businesses. It's a protocol for collective innovation and coordination, inviting individuals from all backgrounds to contribute and add value. ## Architecture @@ -173,13 +173,13 @@ As the basis of the architecture, the blockchain act as a source of truth in the We established OKP4 as a new blockchain for several strategic reasons: -Independence from Existing Chains: We aimed to operate without being reliant on the decisions or the pace of development of Ethereum or any other Layer 1 (L1) or Layer 2 (L2) solutions. This autonomy ensures that our direction isn’t dictated by external factors. 
+Independence from Existing Chains: We aimed to operate without being reliant on the decisions or the pace of development of Ethereum or any other Layer 1 (L1) or Layer 2 (L2) solutions. This autonomy ensures that our direction isn't dictated by external factors. Customization and Speed: By creating our own blockchain, we can move at a pace that aligns with our vision and can tailor the chain specifically for our use case. Our primary goal is to create an optimal environment for off-chain resource governance. -Complete Environment Control: We wanted the freedom to fully customize our environment. We didn’t want to be confined by the limitations of smart contract virtual machines. This also allows us to set specific requirements for our validators, both in terms of software and hardware. +Complete Environment Control: We wanted the freedom to fully customize our environment. We didn't want to be confined by the limitations of smart contract virtual machines. This also allows us to set specific requirements for our validators, both in terms of software and hardware. -Sovereign Governance: We prioritize having sovereignty over our blockchain’s governance. We envision a true DAO consisting of validators and token holders. This structure fosters an environment ripe for experimentation and innovation. +Sovereign Governance: We prioritize having sovereignty over our blockchain's governance. We envision a true DAO consisting of validators and token holders. This structure fosters an environment ripe for experimentation and innovation. Balancing the Trilemma: Every blockchain faces the scalability-decentralization-security trilemma. With our own chain, we can choose our position on this spectrum, balancing these factors according to our priorities. 
@@ -210,9 +210,9 @@ In essence, OKP4 is designed for its specialization in resource sharing and its As we mentioned previously, the number one reason to build a Cosmos chain is sovereignty, more precisely, interoperable sovereignty. For us, Cosmos is a design pattern we leveraged with a set of components that suit our needs: CometBFT, CosmosSDK, IBC, CosmWasm. We then built our custom modules and smart contracts to make the OKP4 blockchain specifically designed for our protocol. -There’s certainly drama around the Cosmos Hub and the ecosystem as a whole because it’s democratic and governance-intensive, and that’s the issue with sovereignty. +There's certainly drama around the Cosmos Hub and the ecosystem as a whole because it's democratic and governance-intensive, and that's the issue with sovereignty. -We see ourselves very independently from that drama. Many projects in the interchain have managed to be seen as industry-specific layer-1s and not attached to the Cosmos brand; we strive for the same. We’re part of the Interchain that wouldn’t be possible without Cosmos, but we’re independent from Cosmos. +We see ourselves very independently from that drama. Many projects in the interchain have managed to be seen as industry-specific layer-1s and not attached to the Cosmos brand; we strive for the same. We're part of the Interchain that wouldn't be possible without Cosmos, but we're independent from Cosmos. We definitely think this trend will accelerate as interoperability becomes seamless and the ecosystems blur on the liquidity side (see IBC everywhere) and UX side (see Metamask snaps). @@ -238,7 +238,7 @@ The side of the "Stone": The instantiation of the smart contract engraves in sto ### What is the Pactum smart contract in a few words? -The Pactum smart contract is designed to streamline and automate the execution of agreements involving multiple parties. 
In contrast to the Law-Stone, which primarily focuses on expressing or stating the law, the Pactum’s core function is to actively execute the law by the predefined terms outlined in the agreement. By leveraging the power of Prolog provided by the Law-Stone smart contract, the Pactum facilitates the seamless implementation of agreed-upon terms and ensures the proper enforcement of contractual obligations. +The Pactum smart contract is designed to streamline and automate the execution of agreements involving multiple parties. In contrast to the Law-Stone, which primarily focuses on expressing or stating the law, the Pactum's core function is to actively execute the law according to the predefined terms outlined in the agreement. By leveraging the power of Prolog provided by the Law-Stone smart contract, the Pactum facilitates the seamless implementation of agreed-upon terms and ensures the proper enforcement of contractual obligations. This smart contract is essential in the governance framework implemented in the OKP4 protocol, particularly in regulating the orchestration of digital resources, which involves many parties. @@ -266,11 +266,11 @@ More details about Token Model [here](https://docs.okp4.network/whitepaper/token ### Can price volatility of the KNOW token be a problem? -The KNOW price volatility can become a problem when it’s used for payment. Imagine a dataset or service providers denominates its price in KNOW. If the KNOW values goes +50%, then it goes the same of providers who may see their resources less used because 50% more expensive. An external service can be used to define another unit of account (like the dollar) and have the $KNOW value updated every time their resource is consumed. +The KNOW price volatility can become a problem when it's used for payment. Imagine a dataset or service provider that denominates its price in KNOW. If the KNOW value rises 50%, the provider's price effectively rises 50% too, and consumers may use the resource less because it has become more expensive.
An external service can be used to define another unit of account (like the dollar) and have the $KNOW value updated every time their resource is consumed. ### Decentralization seems to add complexity. Does that get offset by the open composability? -It’s important to state that we’re building a protocol: trustless and general primitives that form a foundation for many more layers that can be built on top. OKP4 is the first settlement layer for off-chain workflows using shared resources. +It's important to state that we're building a protocol: trustless and general primitives that form a foundation for many more layers that can be built on top. OKP4 is the first settlement layer for off-chain workflows using shared resources. Decentralization introduces complexity but provides many benefits. Composability is really important; all solutions that exist today are not tech-agnostic and rely on specific cloud infrastructure solutions. OKP4, as a protocol, is designed to become the binder between all the existing solutions that are worth sharing and making interoperable. @@ -281,18 +281,18 @@ The ontology primitives provide an open and permissionless source of facts and k The token is also an important element; its design creates real cash flow for token holders, creating a flywheel of incentives for the ecosystem, attracting open-source contributors, data and service providers, and consumers, and essentially bringing the interest of many developers who like a new playground. -OKP4 is a sandbox for human coordination experiments in the off-chain world. You can’t build that and create proper incentives without decentralization.
The complexity that is introduced is worth it, and our mission is, and will be, as a team and community, to abstract the underlying complexity with beautiful no-code interfaces & applications. ## Utilization ### Can you describe how the data transfer works? -First, let me remind that the data is off-chain, only the data’s metadata (it’s description) is on-chain and integrated in the ontology. +First, let me remind you that the data is off-chain; only the data's metadata (its description) is on-chain and integrated in the ontology. When a consumer interacts with the OKP4 blockchain, they might request access to multiple datasets and algorithms. The protocol checks if the conditions set by the data or algorithm providers are satisfied. If they are, the transaction is validated on-chain (the [Pactum smart contract](https://docs.okp4.network/whitepaper/architecture#pactum-managing-agreements) ensures conditions are met, including retributions for providers), with the workflow being described like any service in the ontology (leveraging the [Cognitarium smart contract](https://docs.okp4.network/whitepaper/architecture#cognitarium-semantic-data-storage)). -Subsequent to this validation, an [off-chain orchestration service](https://docs.okp4.network/whitepaper/architecture#orchestration), exemplified by workflow engines like [Argo](https://argoproj.github.io/argo-workflows/), takes over.
This service acts as a gatekeeper for resources, relying exclusively on blockchain validation events to process the consumer's request. It's important to note that this orchestration service doesn't inherently trust any party; it solely trusts the blockchain's validation. -The orchestration service’s role is to create new insights or knowledge by utilizing resources from different providers. Once the processing is complete, the service reports back to the blockchain, ensuring that the execution status is logged reliably and that any due payments are processed. +The orchestration service's role is to create new insights or knowledge by utilizing resources from different providers. Once the processing is complete, the service reports back to the blockchain, ensuring that the execution status is logged reliably and that any due payments are processed. Data transfer is a crucial part of this workflow. Depending on the rules and services invoked, data is transferred between the relevant providers and consumers. The orchestration service facilitates this by fetching data through APIs and requesting services that perform computations on one or more datasets as required. @@ -320,7 +320,7 @@ OKP4 functions as a decentralized coordinator where: ### How does OKP4 ensure the execution and validation of sharing rules? -The protocol itself is primarily an infrastructure that allows a resource "provider" to reference and define usage rules (in Prolog, which offers better expressiveness compared to other languages), and for a "consumer" to make usage requests based on these rules. What the protocol guarantees is the transparency of the rules and the correct assessment (validation or not) of these rules. However, there is indeed the question of verifying the actual "real" sharing action.
The current approach is an open-source off chain "orchestrator" developed by the OKP4 Association team that provides access to resources (based on requests validated onchain) and then reports the successful execution back on chain. So, yes, there’s an element of centralization! We aim to have multiple orchestrators, deployed by external entities, and even decentralize that process at some point. Both "providers" and "consumers" will be able to choose the one they "trust" or deploy their own. +The protocol itself is primarily an infrastructure that allows a resource "provider" to reference and define usage rules (in Prolog, which offers better expressiveness compared to other languages), and for a "consumer" to make usage requests based on these rules. What the protocol guarantees is the transparency of the rules and the correct assessment (validation or not) of these rules. However, there is indeed the question of verifying the actual "real" sharing action. The current approach is an open-source off-chain "orchestrator" developed by the OKP4 Association team that provides access to resources (based on requests validated on-chain) and then reports the successful execution back on-chain. So, yes, there's an element of centralization! We aim to have multiple orchestrators, deployed by external entities, and even decentralize that process at some point. Both "providers" and "consumers" will be able to choose the one they "trust" or deploy their own.
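The gatekeeping role described in this answer can be sketched in a few lines. The following Python sketch is illustrative only (the class and function names are invented, not the actual OKP4 orchestrator API): access is granted only for requests already validated on-chain, and the execution outcome is always reported back.

```python
from dataclasses import dataclass, field


@dataclass
class Chain:
    """Hypothetical stand-in for the blockchain: holds validated request IDs
    and records execution reports posted back by the orchestrator."""
    validated: set = field(default_factory=set)
    reports: dict = field(default_factory=dict)

    def is_validated(self, request_id: str) -> bool:
        return request_id in self.validated

    def report(self, request_id: str, status: str) -> None:
        self.reports[request_id] = status


def orchestrate(chain: Chain, request_id: str, run_workflow) -> str:
    """Run the off-chain workflow only if the chain validated the request,
    then report the outcome (success or failure) back on-chain."""
    if not chain.is_validated(request_id):
        return "rejected"  # no on-chain validation means no access
    try:
        run_workflow()
        chain.report(request_id, "success")
        return "success"
    except Exception:
        chain.report(request_id, "failure")  # failures are also logged on-chain
        return "failure"


# Usage: only the validated request is executed and reported.
chain = Chain(validated={"req-1"})
print(orchestrate(chain, "req-1", lambda: None))  # prints "success"
print(orchestrate(chain, "req-2", lambda: None))  # prints "rejected"
```

The key design point the answer makes is visible here: the orchestrator trusts nothing but the chain's validation state, and the chain ends up holding the execution record.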
More info [here](https://docs.okp4.network/whitepaper/architecture#trusted-parties-considerations) diff --git a/docs/nft-tutorial/fr.md b/docs/nft-tutorial/fr.md index b77ed322c79..db651d04173 100644 --- a/docs/nft-tutorial/fr.md +++ b/docs/nft-tutorial/fr.md @@ -36,7 +36,7 @@ Vous pouvez utiliser une de ces solutions: Connectez vous avec l'adresse fournie par OKP4 en utilisant le fichier fournit dans le mail qui vous a été envoyé: - Décryptez le fichier reçu par mail à l'aide du mot de passe accessible par le lien temporaire fournit dans le même mail -- Cliquez sur l’icône Keplr de votre navigateur : +- Cliquez sur l'icône Keplr de votre navigateur : ![keplr icon](/img/content/nft-tutorial/keplr-icon.webp) - Vous devriez voir l'écran suivant: ![create account](/img/content/nft-tutorial/account-creation-keplr.webp) @@ -45,7 +45,7 @@ Connectez vous avec l'adresse fournie par OKP4 en utilisant le fichier fournit d - Entrer un nom de compte, par exemple `nom + temporary okp4 wallet` - Choisissez un mot pour Keplr si cela vous est demandé. -Maintenant, vous pouvez cliquer sur l’icône de l’extension Keplr +Maintenant, vous pouvez cliquer sur l'icône de l'extension Keplr - Sélectionnez la blockchain `Stargaze` @@ -71,7 +71,7 @@ Vous devriez être redirigé vers la page de votre NFT : Si vous en avez déjà un, vous pouvez sauter cette partie - Ouvrez Keplr -- Cliquer sur l’icône utilisateur (en haut à droite) +- Cliquer sur l'icône utilisateur (en haut à droite) - `+ Add account` - `Create new account` - Suivez les instructions : donnez un nom à votre portefeuille personnel, notez précieusement le mnémonique, puis prouvez que vous pouvez le retrouver. diff --git a/docs/nodes/kms.md b/docs/nodes/kms.md index 5eb959b34c7..ec559324614 100644 --- a/docs/nodes/kms.md +++ b/docs/nodes/kms.md @@ -56,7 +56,7 @@ export RUSTFLAGS=-Ctarget-feature=+aes,+ssse3 ::: -We are ready to install KMS. There are 2 ways to do this: compile from source or install with Rusts cargo-install. 
We’ll use the first option. +We are ready to install KMS. There are two ways to do this: compile from source or install with Rust's `cargo install`. We'll use the first option. ### Compile from source code diff --git a/docs/tutorials/keplr-1.mdx index 5ced8d70c77..71f999edefe 100644 --- a/docs/tutorials/keplr-1.mdx +++ b/docs/tutorials/keplr-1.mdx @@ -12,7 +12,7 @@ This tutorial will guide you through setting up an OKP4 account, obtaining test [Keplr](https://www.keplr.app/) is a popular and widely used wallet application designed specifically for interacting with Cosmos-based blockchains. It is a user-friendly interface allowing individuals to securely manage their accounts, interact with decentralized applications (DApps), and perform various transactions on Cosmos-based blockchains like OKP4. -1. [Visit the Keplr Wallet website](https://www.keplr.app/download) and **download the desktop browser extension** of your choice: Chrome or Firefox. For this tutorial, we’ll install it on Brave (Chrome extension). +1. [Visit the Keplr Wallet website](https://www.keplr.app/download) and **download the desktop browser extension** of your choice: Chrome or Firefox. For this tutorial, we'll install it on Brave (Chrome extension). 2. Click on the Keplr extension icon. A setup page opens. Click on "**Create a new wallet**". @@ -27,10 +27,10 @@ Then provide a name for your wallet (you can type anything you want), and set a ![New Keplr wallet parameters](/img/content/tutorials/keplr-3.webp) -5. Then Keplr asks you to select chains. Don’t worry; select any chain you want. +5. Then Keplr asks you to select chains. Don't worry; select any chain you want. -6. And you’re done! Your account has been created. You can recover your wallet by following these steps but choosing "Import recovery phrase" in step 3. -Optionally, you can **pin Keplr for easy access**: click the ‘Extensions’ button, locate Keplr, and then click the ‘Pin’. +6. And you're done!
Your account has been created. You can recover your wallet by following these steps but choosing "Import recovery phrase" in step 3. +Optionally, you can **pin Keplr for easy access**: click the 'Extensions' button, locate Keplr, and then click 'Pin'. ![New Keplr wallet created](/img/content/tutorials/keplr-4.webp) @@ -50,7 +50,7 @@ The official OKP4 faucet provides a way to obtain test tokens for experimentatio ![Approve OKP4 network adding to Keplr](/img/content/tutorials/keplr-6.webp) -4. And you’re done! Congrats, you received $KNOW tokens 🥳 +4. And you're done! Congrats, you received $KNOW tokens 🥳 ## Check your $KNOW balance and get your OKP4 address @@ -58,12 +58,12 @@
-2. Let’s check you received the $KNOW tokens. Click on the Keplr extension button, look for "OKP4 nemeton" in the list or type "okp4" in the "Search for asset or chain" field to see it easily. Alright, there’s 1 $KNOW in the wallet: +2. Let's check you received the $KNOW tokens. Click on the Keplr extension button, look for "OKP4 nemeton" in the list or type "okp4" in the "Search for asset or chain" field to see it easily. Alright, there's 1 $KNOW in the wallet:
3. Your wallet is identified by an address derived from your recovery phrase. To receive $KNOW tokens, you should provide your OKP4 address to the spender. -Here’s how you can have your OKP4 address: click on the Keplr extension icon and "Copy address". Look for "OKP4 nemeton" in the list or type ‘okp4’ in the "Search for a chain" field to see it easily. Click on "Copy" to have the OKP4 address, which is in the following form: `okp41mkc58ag3am7wvzze4k6pddl8pyr6dql5htuxyu` +Here's how to get your OKP4 address: click the Keplr extension icon and "Copy address". Look for "OKP4 nemeton" in the list, or type 'okp4' in the "Search for a chain" field to find it easily. Click "Copy" to get the OKP4 address, which has the following form: `okp41mkc58ag3am7wvzze4k6pddl8pyr6dql5htuxyu`
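Addresses of the form shown above are bech32 strings whose human-readable prefix is `okp4`. As an illustration only (this is not part of the tutorial's tooling, and it does not verify the bech32 checksum), a quick Python sanity check on the prefix and character set looks like this:

```python
# The 32-character bech32 data charset (deliberately excludes 1, b, i, o).
BECH32_CHARSET = "qpzry9x8gf2tvdw0s3jn54khce6mua7l"


def looks_like_okp4_address(addr: str) -> bool:
    """Cheap sanity check: 'okp4' prefix, '1' separator, valid data charset.
    NOT a full validation -- the bech32 checksum is not verified here."""
    prefix, sep, data = addr.rpartition("1")  # separator is the last '1'
    if prefix != "okp4" or sep != "1" or len(data) < 6:
        return False
    return all(c in BECH32_CHARSET for c in data)


print(looks_like_okp4_address("okp41mkc58ag3am7wvzze4k6pddl8pyr6dql5htuxyu"))  # True
print(looks_like_okp4_address("cosmos1abc"))                                   # False
```

For real validation (checksum included), a proper bech32 library should be used; this sketch only catches obvious copy-paste mistakes.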
@@ -81,11 +81,11 @@ Now that you have $KNOW tokens in your Keplr Wallet, you can send them to other ### Check your transaction with the explorer -You can look at the [OKP4 explorer](https://explore.okp4.network/OKP4%20testnet) to check all transactions you executed. Click the "Search" button (up on the right) and provide your OKP4 address. The $KNOW payment you’ve sent will appear in the Transactions part. +You can look at the [OKP4 explorer](https://explore.okp4.network/OKP4%20testnet) to check all the transactions you executed. Click the "Search" button (top right) and provide your OKP4 address. The $KNOW payment you've sent will appear in the Transactions section. ![Look for your transaction in the OKP4 explorer](/img/content/tutorials/keplr-13.webp) -## Recap’ +## Recap' To get started, you should: @@ -93,6 +93,6 @@ - Safely store the recovery phrase, the secret way to use your account - Add the OKP4 network and get $KNOW tokens via the official faucet -Once you’re done, you can interact with the blockchain; you can send $KNOW tokens to another wallet using the "Send" button in Keplr, for example. Checkout the explorer to see the transactions status and more. +Once you're done, you can interact with the blockchain; you can send $KNOW tokens to another wallet using the "Send" button in Keplr, for example. Check out the explorer to see transaction statuses and more. Congratulations! You have successfully learned how to set up an OKP4 account using Keplr Wallet, obtain test tokens from the official faucet, and send transactions. With this knowledge, you can actively participate in the OKP4 blockchain ecosystem, explore its features, build new applications and interact with deployed smart contracts. Enjoy your journey into the world of OKP4!
diff --git a/docs/whitepaper/introduction.md b/docs/whitepaper/introduction.md index 201014e1993..b7b0b9aecb5 100644 --- a/docs/whitepaper/introduction.md +++ b/docs/whitepaper/introduction.md @@ -13,7 +13,7 @@ But what if we could unlock knowledge creation through trusted and incentivized ## We need knowledge In light of pressing issues such as climate change and natural resource scarcity, the urgent need for knowledge creation becomes evident. -We don’t know how to go to Mars, we don’t know how to predict cancer, we don’t know how to farm in a desert, and we don’t know what present would please our mother for her birthday. +We don't know how to go to Mars, we don't know how to predict cancer, we don't know how to farm in a desert, and we don't know what present would please our mother for her birthday. Knowledge is the limiting factor to the common good for individuals, companies and society at large. Knowledge, in its various forms, is a powerful catalyst for progress and innovation, impacting all areas of human endeavor from economic growth and social development to technological advancement and cultural enrichment. @@ -44,7 +44,7 @@ The pace of technological advancement has far outstripped our ability to fully e ![intro-2-balance](/img/content/whitepaper/intro-2-balance.webp) Trust plays a critical role in the decision to share resources. In a world where data breaches and misuse are all too common, there's a legitimate lack of trust. Providers need to be confident that their consents and rules will be respected. Furthermore, the risk of data becoming accessible to untrusted parties or being mismanaged by third parties is a significant deterrent. -The question of perceived value and incentives is another major barrier. The value proposition of sharing digital resources is often unclear. Even when it is explicit, the conventional data marketplace model is flawed. 
Fixed prices on data are suboptimal because the true value lies in the knowledge generated from the data, not the data itself. This muddled value perception creates a lack of desire to share resources that doesn’t outweigh the risks and costs. +The question of perceived value and incentives is another major barrier. The value proposition of sharing digital resources is often unclear. Even when it is explicit, the conventional data marketplace model is flawed. Fixed prices on data are suboptimal because the true value lies in the knowledge generated from the data, not the data itself. This muddles the perceived value, and the desire to share resources doesn't outweigh the risks and costs. Technical complexities and costs present a substantial barrier. The current landscape is fragmented, making resource sharing a complex and costly endeavor. The myriad platforms, protocols, and standards introduce technical challenges, while data integration and transformation carry significant costs. These technical difficulties further contribute to the siloing of resources and knowledge. ![intro-3-silot](/img/content/whitepaper/intro-3-silot.webp) @@ -80,7 +80,7 @@ Openness is a critical principle to ensure transparency, encourage innovation, a - **Auditability:** The openness of the decentralized infrastructure allows anyone to audit the system. This means that any interested person or organization can verify the system's proper functioning, including compliance with access and sharing rules. This auditability enhances transparency and trust in the system, which is essential for ethical and equitable sharing of digital resources. In response to these challenges, blockchain technologies play a crucial role. They can offer a secure, transparent, and decentralized infrastructure, extending their value beyond financial transactions to peer-to-peer data exchanges and secure collaborations.
This is increasingly important in a data-driven world where privacy, security, and the ability to customize solutions are paramount. Hence, the relevance of blockchain technology in creating a more open and interconnected world of secure data sharing is profound, and its potential stretches across numerous sectors. -Given the sensitivity of the data and knowledge sector, the infrastructure must be as resilient and neutral as possible. Moreover, due to the intricacy of this field, there's a specific and adaptive design requirement that fosters sharing while managing these complexities efficiently. That’s why the **modularity aspect** is essential. +Given the sensitivity of the data and knowledge sector, the infrastructure must be as resilient and neutral as possible. Moreover, due to the intricacy of this field, there's a specific and adaptive design requirement that fosters sharing while managing these complexities efficiently. That's why the **modularity aspect** is essential. ### Modularity Challenges diff --git a/docs/whitepaper/solution.md index 130df540599..86904fb9a88 100644 --- a/docs/whitepaper/solution.md +++ b/docs/whitepaper/solution.md @@ -76,7 +76,7 @@ All resources, services and Zones are found within the same universe, the Datave Zones can be nested and overlapping, as one resource or service can participate in many Zones, and many applications can be built on top of one Zone. The whole is greater than the sum of its parts: **this is the Dataverse.** -**What’s the purpose ?** +**What's the purpose?** **Creating a general-purpose ecosystem that enables XaaS integration** Anything that is presented to the Protocol as a Service can be used by the Protocol, whatever it does, wherever it is hosted or deployed (in the cloud or on premise), and whoever provides it. Therein lies **the integration power of the protocol**, which brings infinite scalability and extensibility to the entire OKP4 ecosystem.
The description of each resource referenced on the protocol ensures its proper processing by the protocol's different entities. @@ -127,12 +127,12 @@ BI tools can be effectively employed to comprehend the data shared within the zo Given that resources are not directly attached to a specific Zone, but rather deemed compatible with certain Zones based on their rules and conditions, it is crucial for the OKP4 protocol to accurately **represent and interpret** these diverse concepts within its framework. Furthermore, the protocol must efficiently distinguish and understand various consent rules, their dependencies, and hierarchies. Consequently, the protocol has to **express the context as clearly as possible**, along with the meaning of the concepts and their relationships. To truly leverage the power of knowledge, the OKP4 protocol interprets all these entities and concepts within a universal language taking into account **the semantic aspect** of each of them. -Let’s take an illustration to clearly understand. +Let's take an illustration to clearly understand. ![solution-6](/img/content/whitepaper/solution-6.webp) The structural language only gives a definition of concepts in a wide range. However, the semantic language could give **the meaning** of concepts, properties, relationships and entities. -The expressiveness of the language enhances **the disambiguation and the interoperability** between entities in a computable form. To give meaning to each concept and entity in the protocol, we’ve decided to encode an ontology to represent and persist the information on the blockchain. +The expressiveness of the language enhances **the disambiguation and the interoperability** between entities in a computable form. To give meaning to each concept and entity in the protocol, we've decided to encode an ontology to represent and persist the information on the blockchain.
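An ontology of this kind boils down to statements shaped as (subject, predicate, object) triples, the same shape RDF uses. As a toy illustration only (the identifiers below are invented, not the actual OKP4 ontology vocabulary), here is how such facts can be stored and queried by pattern:

```python
# A toy triple store: each fact is a (subject, predicate, object) tuple.
# All identifiers are illustrative, not real OKP4 ontology terms.
triples = [
    ("dataset:crop-yields", "a", "okp4:Dataset"),
    ("dataset:crop-yields", "okp4:hasTopic", "topic:agriculture"),
    ("dataset:crop-yields", "okp4:providedBy", "org:acme"),
    ("service:ml-training", "a", "okp4:Service"),
    ("service:ml-training", "okp4:consumes", "dataset:crop-yields"),
]


def query(s=None, p=None, o=None):
    """Return all triples matching the given pattern; None acts as a wildcard."""
    return [
        t for t in triples
        if (s is None or t[0] == s)
        and (p is None or t[1] == p)
        and (o is None or t[2] == o)
    ]


# Which resources does the training service consume?
print(query(s="service:ml-training", p="okp4:consumes"))
```

The point of the semantic approach described above is exactly this queryability: because every entity is described with explicit, typed relationships, a machine can answer questions about resources without ambiguity.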
**Ontology overview:** In a broader, philosophical context, ontology is concerned with the nature of being, structuring a system of universal categories and their intrinsic relationships to explain existence itself.