## Overview

Proof-of-Inference (PoI) is a cryptographic verification system that enables trustless verification of AI model outputs on the blockchain. By creating hash-based commitments for inputs, outputs, and model identifiers, Nexis ensures that inference results can be verified without requiring the full computation to be replicated on-chain.
Key properties:

- **Cryptographic Commitments**: hash-based commitments for inputs, outputs, and models
- **IPFS Integration**: decentralized storage for proof artifacts
- **On-Chain Verification**: smart contract attestation and validation
- **Economic Security**: stake-backed guarantees with slashing
## Architecture

The Proof-of-Inference system consists of several interconnected components:
## InferenceCommitment Structure

The core data structure for proof-of-inference is the `InferenceCommitment` struct:
### Solidity Definition
```solidity
struct InferenceCommitment {
    uint256 agentId;      // Unique agent identifier
    bytes32 inputHash;    // keccak256 hash of input data
    bytes32 outputHash;   // keccak256 hash of inference output
    bytes32 modelHash;    // keccak256 hash of model identifier
    uint256 taskId;       // Associated task ID (0 if standalone)
    address reporter;     // Address that submitted the commitment
    string proofURI;      // IPFS URI containing full proof artifacts
    uint64 timestamp;     // Block timestamp of commitment
}
```
### Field Descriptions

| Field | Type | Description | Purpose |
|-------|------|-------------|---------|
| `agentId` | `uint256` | Unique identifier for the AI agent | Links commitment to registered agent |
| `inputHash` | `bytes32` | keccak256 hash of input data | Commitment to input without revealing it |
| `outputHash` | `bytes32` | keccak256 hash of inference result | Commitment to output for verification |
| `modelHash` | `bytes32` | Hash of model identifier/version | Ensures specific model was used |
| `taskId` | `uint256` | Associated task ID (0 if none) | Links to task execution framework |
| `reporter` | `address` | Submitter's Ethereum address | Authorization and accountability |
| `proofURI` | `string` | IPFS URI to proof artifacts | Reference to detailed verification data |
| `timestamp` | `uint64` | Block timestamp | Temporal ordering and deadlines |
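The same structure can be mirrored off-chain as a TypeScript interface with a basic sanity check before submission (a sketch; the field names follow the Solidity struct, but the validation rules here are illustrative, not part of the SDK):

```typescript
// Mirror of the on-chain InferenceCommitment struct (sketch; the
// bytes32/address fields are represented as 0x-prefixed hex strings).
interface InferenceCommitment {
  agentId: bigint;
  inputHash: string;   // 32-byte hex: "0x" + 64 hex chars
  outputHash: string;
  modelHash: string;
  taskId: bigint;      // 0n if standalone
  reporter: string;    // 20-byte hex address
  proofURI: string;    // e.g. "ipfs://Qm..."
  timestamp: bigint;
}

// Illustrative sanity check before submitting a commitment on-chain.
function isWellFormedCommitment(c: InferenceCommitment): boolean {
  const isBytes32 = (s: string) => /^0x[0-9a-fA-F]{64}$/.test(s);
  const isAddress = (s: string) => /^0x[0-9a-fA-F]{40}$/.test(s);
  return (
    c.agentId > 0n &&
    isBytes32(c.inputHash) &&
    isBytes32(c.outputHash) &&
    isBytes32(c.modelHash) &&
    isAddress(c.reporter) &&
    c.proofURI.startsWith("ipfs://")
  );
}
```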
## Hash Commitment Scheme

### Input Hash

The input hash is a cryptographic commitment to the input data:
```typescript
import { ethers } from "ethers";

// Recursively sort object keys so the same logical input always
// serializes (and therefore hashes) identically. Note: passing a key
// array as JSON.stringify's replacer filters keys at every nesting
// level, so it would silently drop nested fields.
function canonicalize(value: any): any {
  if (Array.isArray(value)) return value.map(canonicalize);
  if (value !== null && typeof value === "object") {
    const sorted: any = {};
    for (const key of Object.keys(value).sort()) {
      sorted[key] = canonicalize(value[key]);
    }
    return sorted;
  }
  return value;
}

function computeInputHash(inputData: any): string {
  // Serialize input data to canonical JSON
  const serialized = JSON.stringify(canonicalize(inputData));

  // Compute keccak256 hash
  return ethers.utils.keccak256(ethers.utils.toUtf8Bytes(serialized));
}

// Example usage
const input = {
  prompt: "Generate an image of a sunset",
  parameters: {
    width: 512,
    height: 512,
    steps: 50
  }
};

const inputHash = computeInputHash(input);
console.log(`Input Hash: ${inputHash}`);
```
### Output Hash

The output hash commits to the inference result:
```typescript
import { ethers } from "ethers";
import fs from "fs";

// Recursively sort object keys for deterministic serialization
function canonicalize(value: any): any {
  if (Array.isArray(value)) return value.map(canonicalize);
  if (value !== null && typeof value === "object") {
    const sorted: any = {};
    for (const k of Object.keys(value).sort()) sorted[k] = canonicalize(value[k]);
    return sorted;
  }
  return value;
}

function computeOutputHash(output: any): string {
  if (typeof output === "string") {
    // For text outputs
    return ethers.utils.keccak256(ethers.utils.toUtf8Bytes(output));
  } else if (output instanceof Uint8Array || Buffer.isBuffer(output)) {
    // For binary outputs (images, audio, etc.)
    return ethers.utils.keccak256(output);
  } else if (output !== null && typeof output === "object") {
    // For structured outputs
    const serialized = JSON.stringify(canonicalize(output));
    return ethers.utils.keccak256(ethers.utils.toUtf8Bytes(serialized));
  } else {
    throw new Error("Unsupported output type");
  }
}

// Example: Text output
const textOutput = "The sunset paints the sky in brilliant orange and pink hues.";
const textHash = computeOutputHash(textOutput);

// Example: Image output (binary data)
const imageBuffer = fs.readFileSync("generated_sunset.png");
const imageHash = computeOutputHash(imageBuffer);

// Example: Structured output
const structuredOutput = {
  result: "success",
  data: {
    classification: "sunset",
    confidence: 0.98
  }
};
const structuredHash = computeOutputHash(structuredOutput);
```
### Model Hash

The model hash identifies the specific model version used:
```typescript
function computeModelHash(modelInfo: {
  name: string;
  version: string;
  checksum?: string;
}): string {
  // Create unique model identifier
  const identifier = `${modelInfo.name}:${modelInfo.version}`;

  // If a model checksum is available, include it
  const fullIdentifier = modelInfo.checksum
    ? `${identifier}:${modelInfo.checksum}`
    : identifier;

  return ethers.utils.keccak256(ethers.utils.toUtf8Bytes(fullIdentifier));
}

// Example usage
const modelHash = computeModelHash({
  name: "stable-diffusion",
  version: "v2.1",
  checksum: "sha256:abc123..."
});
console.log(`Model Hash: ${modelHash}`);
```
## Recording Inference

The `recordInference` function creates an on-chain commitment:
### Implementation

```typescript
import { ethers } from "ethers";
import { NexisAgents } from "@nexis-network/sdk";
import FormData from "form-data";
import fetch from "node-fetch";

async function recordInference(
  agentId: bigint,
  inputData: any,
  outputData: any,
  modelInfo: { name: string; version: string },
  taskId: bigint = 0n
) {
  // 1. Compute hashes (helpers defined in "Hash Commitment Scheme")
  const inputHash = computeInputHash(inputData);
  const outputHash = computeOutputHash(outputData);
  const modelHash = computeModelHash(modelInfo);

  // 2. Prepare proof artifacts
  const proofArtifact = {
    input: inputData,
    output: outputData,
    model: modelInfo,
    metadata: {
      timestamp: Date.now(),
      agentId: agentId.toString(),
      executionTime: 1234, // milliseconds
      gpuModel: "NVIDIA A100",
      framework: "PyTorch 2.0"
    }
  };

  // 3. Upload to IPFS
  const proofURI = await uploadToIPFS(proofArtifact);
  console.log(`Proof uploaded to: ${proofURI}`);

  // 4. Record on-chain (AGENTS_ADDRESS and signer configured elsewhere)
  const agents = new NexisAgents(AGENTS_ADDRESS, signer);
  const tx = await agents.recordInference(
    agentId,
    inputHash,
    outputHash,
    modelHash,
    taskId,
    proofURI
  );
  const receipt = await tx.wait();

  // 5. Extract inference ID from event
  const event = receipt.events?.find(e => e.event === "InferenceRecorded");
  const inferenceId = event?.args?.inferenceId;
  console.log(`Inference recorded with ID: ${inferenceId}`);

  return inferenceId;
}

// Helper to upload to IPFS. Note Infura's /api/v0/add endpoint expects
// multipart form data, not a raw JSON body.
async function uploadToIPFS(data: any): Promise<string> {
  const auth = Buffer.from(
    `${INFURA_PROJECT_ID}:${INFURA_PROJECT_SECRET}`
  ).toString("base64");

  const form = new FormData();
  form.append("file", JSON.stringify(data));

  const response = await fetch("https://ipfs.infura.io:5001/api/v0/add", {
    method: "POST",
    headers: { "Authorization": `Basic ${auth}` },
    body: form
  });

  const result = await response.json();
  return `ipfs://${result.Hash}`;
}
```
### Event Emission

When an inference is recorded, the contract emits an event:
```solidity
event InferenceRecorded(
    uint256 indexed agentId,
    bytes32 indexed inferenceId,
    bytes32 indexed inputHash,
    bytes32 outputHash,
    bytes32 modelHash,
    uint256 taskId,
    address reporter,
    string proofURI
);
```

This event can be indexed and listened to by verifiers, task systems, and monitoring tools.
## Verification Process

The verification process involves multiple steps:

1. Fetch the inference commitment from the contract
2. Download the proof artifacts from IPFS
3. Validate the hash commitments against the artifacts
4. Perform domain-specific verification
5. Prepare reputation deltas
6. Submit the attestation on-chain
### Attestation Implementation
```typescript
import { ethers } from "ethers";
import { NexisAgents } from "@nexis-network/sdk";
import fetch from "node-fetch";

interface ReputationDelta {
  dimension: string;
  delta: number;
  reason: string;
}

async function attestInference(
  inferenceId: string,
  verifyResult: boolean,
  attestationURI: string,
  reputationDeltas: ReputationDelta[]
) {
  // 1. Fetch the inference commitment
  const agents = new NexisAgents(AGENTS_ADDRESS, signer);
  const [commitment, _] = await agents.getInference(inferenceId);

  console.log(`Verifying inference ${inferenceId} for agent ${commitment.agentId}`);

  // 2. Download proof artifacts from IPFS
  const proofData = await fetchFromIPFS(commitment.proofURI);

  // 3. Validate hash commitments (hash helpers defined earlier)
  const inputHashValid = computeInputHash(proofData.input) === commitment.inputHash;
  const outputHashValid = computeOutputHash(proofData.output) === commitment.outputHash;
  const modelHashValid = computeModelHash(proofData.model) === commitment.modelHash;

  if (!inputHashValid || !outputHashValid || !modelHashValid) {
    console.error("Hash validation failed!");
    verifyResult = false;
  }

  // 4. Perform domain-specific verification
  // (This is where custom verification logic goes)
  const customVerificationPassed = await performCustomVerification(proofData);

  // 5. Prepare reputation deltas
  const deltas: ReputationDelta[] = [];

  if (verifyResult && customVerificationPassed) {
    deltas.push({
      dimension: ethers.utils.id("accuracy"),
      delta: 10,
      reason: "Successful inference verification"
    });
    deltas.push({
      dimension: ethers.utils.id("reliability"),
      delta: 5,
      reason: "Timely submission"
    });
  } else {
    deltas.push({
      dimension: ethers.utils.id("accuracy"),
      delta: -20,
      reason: "Failed verification"
    });
    deltas.push({
      dimension: ethers.utils.id("trustworthiness"),
      delta: -10,
      reason: "Invalid inference commitment"
    });
  }

  // 6. Submit attestation on-chain
  const tx = await agents.attestInference(
    inferenceId,
    verifyResult && customVerificationPassed,
    attestationURI,
    deltas
  );
  await tx.wait();

  console.log(`Attestation submitted for inference ${inferenceId}`);
}

async function fetchFromIPFS(uri: string): Promise<any> {
  const cid = uri.replace("ipfs://", "");
  const response = await fetch(`https://ipfs.io/ipfs/${cid}`);
  return await response.json();
}

async function performCustomVerification(proofData: any): Promise<boolean> {
  // Custom verification logic. Examples:
  // - Re-run the model with the same inputs
  // - Check output format and validity
  // - Validate against known constraints
  // - Compare with ensemble predictions
  return true; // Placeholder
}
```
### Attestation Event

```solidity
event InferenceAttested(
    bytes32 indexed inferenceId,
    uint256 indexed agentId,
    uint256 indexed taskId,
    address verifier,
    bool success,
    string uri
);
```

## IPFS Integration

### Proof Artifact Structure

The proof URI points to a comprehensive artifact on IPFS:
```json
{
  "version": "1.0",
  "inference": {
    "inferenceId": "0x1234...",
    "agentId": "12345",
    "timestamp": 1234567890
  },
  "input": {
    "data": "...",
    "format": "json",
    "hash": "0xabcd..."
  },
  "output": {
    "data": "...",
    "format": "json|binary",
    "hash": "0xef12..."
  },
  "model": {
    "name": "stable-diffusion",
    "version": "v2.1",
    "checksum": "sha256:...",
    "hash": "0x3456...",
    "framework": "PyTorch",
    "weights_uri": "ipfs://Qm..."
  },
  "execution": {
    "start_time": 1234567890,
    "end_time": 1234567900,
    "duration_ms": 10000,
    "gpu_model": "NVIDIA A100",
    "gpu_memory_used": "12GB",
    "framework_version": "2.0.1"
  },
  "verification": {
    "reproducibility": {
      "seed": 42,
      "deterministic": true,
      "environment": "..."
    },
    "metrics": {
      "confidence": 0.98,
      "alternative_outputs": []
    }
  }
}
```

### Uploading to IPFS
```typescript
import FormData from "form-data";
import fetch from "node-fetch";

async function uploadToInfuraIPFS(data: any): Promise<string> {
  const auth = Buffer.from(
    `${process.env.INFURA_PROJECT_ID}:${process.env.INFURA_PROJECT_SECRET}`
  ).toString("base64");

  const form = new FormData();
  form.append("file", JSON.stringify(data));

  const response = await fetch("https://ipfs.infura.io:5001/api/v0/add", {
    method: "POST",
    headers: {
      "Authorization": `Basic ${auth}`
    },
    body: form
  });

  const result = await response.json();
  return `ipfs://${result.Hash}`;
}
```
## Security Considerations

### Hash Collision Resistance

Always use keccak256 (Ethereum's Keccak-256, which differs from the finalized NIST SHA3-256 only in its padding rule) for hash commitments. Never use broken hash functions like SHA-1 or MD5.
The keccak256 hash function provides:

- **Collision Resistance**: computationally infeasible to find two inputs with the same hash
- **Pre-image Resistance**: cannot reverse a hash to recover the original input
- **Second Pre-image Resistance**: cannot find a different input with the same hash
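The commitment pattern these properties enable (hash now, reveal and re-hash later) can be sketched as follows. Note this sketch uses Node's built-in SHA-256 purely so it runs without ethers; the on-chain scheme uses keccak256, but the commit/verify logic is identical:

```typescript
import { createHash } from "node:crypto";

// Stand-in for keccak256 so this sketch runs without ethers; the
// commitment pattern is the same regardless of the hash function used.
function commit(data: string): string {
  return "0x" + createHash("sha256").update(data, "utf8").digest("hex");
}

// Verification: re-hash the revealed data and compare digests
function verifyCommitment(revealed: string, commitment: string): boolean {
  return commit(revealed) === commitment;
}

const h1 = commit("Generate an image of a sunset");
const h2 = commit("Generate an image of a sunsets"); // one extra character

console.log(h1 !== h2); // small input changes produce unrelated digests
console.log(verifyCommitment("Generate an image of a sunset", h1));
```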
### Timestamp Validation

Always validate timestamps against block timestamps to prevent replay attacks.
```solidity
// In Agents.sol
timestamp: uint64(block.timestamp)
```

This ensures:

- Temporal ordering of inferences
- Deadline enforcement
- Replay attack prevention
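Off-chain, a verifier can apply the same idea before accepting a commitment. A minimal sketch, where the maximum-age policy and the `isFresh` helper are illustrative assumptions, not part of the contract API:

```typescript
// Illustrative freshness check for a commitment timestamp.
// maxAgeSeconds is an assumed verifier policy, not a protocol constant.
function isFresh(
  commitmentTimestamp: bigint, // uint64 block timestamp from the commitment
  nowSeconds: bigint,          // current time, e.g. the latest block timestamp
  maxAgeSeconds: bigint
): boolean {
  if (commitmentTimestamp > nowSeconds) return false; // future-dated: reject
  return nowSeconds - commitmentTimestamp <= maxAgeSeconds;
}

console.log(isFresh(1_000n, 1_500n, 600n)); // 500s old, within the 600s window
console.log(isFresh(1_000n, 2_000n, 600n)); // 1000s old, too stale
```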
### Proof URI Integrity

Validate IPFS URIs and implement redundant pinning to prevent data loss.

Best practices:

- **Pin to Multiple Services**: use Infura, Pinata, and local nodes
- **Verify CID**: recompute the content hash and compare with the URI
- **Set Expiry**: implement proof retention policies
- **Backup Critical Data**: archive important proofs off IPFS
## Advanced Patterns

### Batch Verification

Verify multiple inferences in one pass (the attestations below are still submitted as individual transactions):
```typescript
async function batchAttest(inferenceIds: string[]) {
  const attestations = await Promise.all(
    inferenceIds.map(async id => {
      const [commitment] = await agents.getInference(id);
      const proofData = await fetchFromIPFS(commitment.proofURI);
      const valid = await verifyProof(proofData);
      return { id, valid, deltas: computeDeltas(valid) };
    })
  );

  // Submit attestations
  for (const att of attestations) {
    await agents.attestInference(att.id, att.valid, "", att.deltas);
  }
}
```

### Probabilistic Verification

Reduce costs by verifying a random sample:
```typescript
async function probabilisticVerification(
  inferenceIds: string[],
  sampleRate: number = 0.1
) {
  // Randomly sample inferences
  const sampled = inferenceIds.filter(() => Math.random() < sampleRate);

  // Verify the sample
  await batchAttest(sampled);

  // Assume the rest are valid if the sample passes
  // (with reputation decay if not verified)
}
```

### Optimistic Verification

Assume inferences are valid unless challenged during a dispute window.
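A minimal in-memory sketch of the optimistic pattern. The challenge window length and the bookkeeping types are illustrative assumptions; in practice the dispute mechanism lives on-chain:

```typescript
// Assumed policy value for illustration, not a protocol constant
const CHALLENGE_WINDOW_SECONDS = 3600;

type PendingInference = {
  inferenceId: string;
  recordedAt: number; // unix seconds
  challenged: boolean;
};

// An inference finalizes as valid once the window passes unchallenged
function isFinalized(p: PendingInference, nowSeconds: number): boolean {
  return !p.challenged &&
    nowSeconds - p.recordedAt >= CHALLENGE_WINDOW_SECONDS;
}

// A challenge is only accepted inside the window; a successful
// challenge triggers full verification (and possible slashing)
function challenge(p: PendingInference, nowSeconds: number): boolean {
  if (nowSeconds - p.recordedAt >= CHALLENGE_WINDOW_SECONDS) return false;
  p.challenged = true;
  return true;
}
```

The trade-off: verifiers only do full work when someone disputes a result, so honest-majority workloads get much cheaper, at the cost of a finality delay equal to the window.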
## Gas Optimization

Minimize on-chain storage by storing only hashes, not full data.
```solidity
// ✅ Efficient: Store only the hash
bytes32 inputHash;

// ❌ Expensive: Store full data on-chain
string inputData;
```

## IPFS Optimization
- **Use CDN Gateways**: cache frequently accessed proofs
- **Implement Lazy Loading**: load proofs only when needed
- **Compress Artifacts**: use gzip compression for JSON data
- **Batch Uploads**: combine multiple proofs in a single IPFS object
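The compression point can be sketched with Node's built-in zlib; JSON proofs with repetitive fields compress well, though actual ratios depend on the artifact content:

```typescript
import { gzipSync, gunzipSync } from "node:zlib";

// Compress a proof artifact before uploading; decompress after fetching
function compressArtifact(artifact: object): Buffer {
  return gzipSync(Buffer.from(JSON.stringify(artifact), "utf8"));
}

function decompressArtifact(compressed: Buffer): object {
  return JSON.parse(gunzipSync(compressed).toString("utf8"));
}

// Example: a structured artifact with many repetitive entries
const artifact = {
  version: "1.0",
  outputs: Array.from({ length: 100 }, (_, i) => ({
    index: i,
    classification: "sunset",
    confidence: 0.98
  }))
};

const compressed = compressArtifact(artifact);
const restored = decompressArtifact(compressed);
console.log(JSON.stringify(restored) === JSON.stringify(artifact)); // round-trips
```

Remember that compressing changes the bytes IPFS addresses, so the CID (and any content hash recorded alongside it) must be computed over the compressed form.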
## Event Indexing

Use The Graph or a similar indexing service for efficient event querying:
```graphql
query RecentInferences($agentId: BigInt!) {
  inferenceRecordeds(
    where: { agentId: $agentId }
    orderBy: timestamp
    orderDirection: desc
    first: 10
  ) {
    inferenceId
    inputHash
    outputHash
    modelHash
    taskId
    proofURI
    timestamp
  }
}
```

## Testing & Development

### Local Testing

```typescript
import { expect } from "chai";
import { ethers } from "hardhat";

describe("Proof-of-Inference", function () {
  it("should record and verify inference", async function () {
    const [owner, treasury, agent, verifier] = await ethers.getSigners();

    // Deploy contracts
    const Agents = await ethers.getContractFactory("Agents");
    const agents = await Agents.deploy();
    await agents.initialize(owner.address, treasury.address);

    // Grant verifier role
    await agents.grantRole(await agents.VERIFIER_ROLE(), verifier.address);

    // Register agent
    await agents.connect(agent).register(1, "metadata", "serviceURI");

    // Record inference
    const inputHash = ethers.utils.id("input");
    const outputHash = ethers.utils.id("output");
    const modelHash = ethers.utils.id("model");

    const tx = await agents.connect(agent).recordInference(
      1,
      inputHash,
      outputHash,
      modelHash,
      0,
      "ipfs://proof"
    );
    const receipt = await tx.wait();
    const event = receipt.events?.find(e => e.event === "InferenceRecorded");
    const inferenceId = event?.args?.inferenceId;

    // Verify inference
    await agents.connect(verifier).attestInference(
      inferenceId,
      true,
      "ipfs://attestation",
      []
    );

    // Check commitment
    const [commitment] = await agents.getInference(inferenceId);
    expect(commitment.inputHash).to.equal(inputHash);
    expect(commitment.outputHash).to.equal(outputHash);
    expect(commitment.modelHash).to.equal(modelHash);
  });
});
```

## Troubleshooting

### Common Issues
**Problem**: Computed hashes don't match on-chain commitments

**Solutions**:

- Ensure consistent serialization (sort JSON keys)
- Use the same encoding (UTF-8 for strings)
- Verify byte order for binary data
**Problem**: Cannot upload proof artifacts to IPFS

**Solutions**:

- Check IPFS node connectivity
- Verify API credentials (Infura/Pinata)
- Reduce the artifact size if too large
- Use an alternative IPFS service
**Problem**: Attestation takes too long or times out

**Solutions**:

- Implement an async verification queue
- Use probabilistic sampling for large batches
- Optimize IPFS gateway performance
- Cache frequently accessed proofs
**Problem**: Transaction reverts with "UnauthorizedDelegate"

**Solutions**:

- Verify VERIFIER_ROLE is granted
- Check that the signer address matches the verifier
- Ensure the contract is not paused

## Next Steps