Announcement
· February 18, 2026

Beta Testers Needed for our Upcoming InterSystems ObjectScript Specialist Certification Exam

Hello DC community, 

InterSystems Certification is currently developing a certification exam for ObjectScript developers, and if you match the exam candidate description below, we would like you to beta test the exam! The exam will be available for beta testing starting February 18th, 2026. 

Beta testing will be completed by May 4, 2026.

What are my responsibilities as a beta tester?

As a beta tester, we ask that you schedule and take the exam by May 4, 2026. The exam will be administered in an online proctored environment free of charge (the standard fee of $150 per exam is waived for all beta testers). The InterSystems Certification team will then perform a careful statistical analysis of all beta test data to set a passing score for the exam. The analysis of the beta test results typically takes 6-8 weeks, and once the passing score is established, you will receive an email notification from InterSystems Certification informing you of the results. If your score on the exam is at or above the passing score, you will have earned the certification!

Note: Beta test scores are completely confidential. 

Interested in participating? Read the Exam Details below. 

Exam Details

Exam title: InterSystems ObjectScript Specialist

Candidate description: An IT professional who:

  • is familiar with object-oriented programming concepts, and
  • uses InterSystems ObjectScript programming language, including objects and SQL, for data access.

Recommended practical experience: At least 6 months to 1 year of experience writing and executing ObjectScript code using InterSystems technologies.

Recommended Preparation: Review the following:

Classroom Training

Online Learning

Documentation

Exam practice questions 

A set of practice questions is provided here to help familiarize candidates with question formats and approaches.

Exam format 

The questions are presented in two formats: multiple choice and multiple response. Access to InterSystems IRIS Documentation will be available during the exam.  

DISCLAIMER: Please note this exam has a 2-hour time limit. While InterSystems documentation will be available during the exam, candidates will not have time to search the documentation for every question. Thus, completing the recommended preparation before taking the exam, and searching the documentation only when absolutely necessary during the exam, are both strongly encouraged! 

System requirements for beta testing

  • Working camera & microphone
  • Dual-core CPU
  • At least 2 GB of available RAM
  • At least 500 MB of available disk space
  • Minimum internet speed:
    • Download - 500kb/s
    • Upload - 500kb/s

Exam topics and content

The exam contains questions that cover the areas for the stated role as shown in the exam topics chart immediately below:

  • Manages Data Model (23)
  • Applies basic programming concepts to InterSystems ObjectScript and SQL (13)
  • Identifies and leverages features unique to InterSystems IRIS (14)
  • Uses ObjectScript functions and APIs for common operations (26)
  • Handles and resolves errors in InterSystems IRIS (12)

1.1 Uses classes

  1. Identifies use cases for persistent and registered object classes
  2. Creates and saves a persistent object
  3. Deletes objects
  4. Interprets storage definitions
  5. Implements multiple inheritance
  6. Documents classes

1.2 Creates properties, indexes, and other class members

  1. Sets max length for string properties
  2. Uses stream properties for large data sets
  3. Creates properties that calculate values dynamically or are auto-updated (e.g., a timestamp for last updated)
  4. Creates and validates class member parameters and attributes
  5. Selects appropriate index type based on data distribution
  6. Uses unique index methods
  7. Recalls how foreign keys enforce referential integrity

1.3 Creates ObjectScript methods

  1. Differentiates between instance and class methods
  2. Uses class parameters inside methods
  3. Specifies method arguments and return type
  4. Passes objects to methods
  5. Passes variables by reference
  6. Passes multidimensional variables by reference
  7. Uses and overrides inherited methods
  8. Determines when to use ##super for calling superclass methods

1.4 Uses complex structures 

  1. Creates dynamic objects/arrays (JSON) 
  2. Uses stream objects of the appropriate type

2.1 Ensures data integrity

  1. Manages transactions
  2. Manages rollbacks
  3. Describes how LOCKs enforce concurrency
  4. Describes lock escalation threshold and effect on row locks versus table locks
  5. Differentiates between pessimistic and optimistic concurrency controls
  6. Uses transactions and applies concurrency controls in SQL scripts

2.2 Tracks application data

  1. Locates and accesses application globals
  2. Uses logging to track application data
  3. Adds and tracks metrics for performance monitoring

2.3 Implements security features when writing code

  1. Ensures appropriate variable and global use to avoid security leaks
  2. Checks roles for permission control
  3. Prevents SQL injection attacks
  4. Implements embedded SQL permission checks

3.1 Differentiates between different storage media in InterSystems IRIS

  1. Differentiates between PPGs, variables, temporary globals, and globals

3.2 Leverages InterSystems ObjectScript and SQL features

  1. Identifies ObjectScript as a weakly typed language and contrasts its usage with strongly typed languages
  2. Uses system macros and include files
  3. Describes how object structures are projected to SQL tables
  4. Differentiates between Embedded and Dynamic SQL
  5. Differentiates between runtime and select modes, and enforces the correct mode
  6. Uses SQL variables such as ROWID and SQLCODE

3.3 Handles nulls

  1. Manages $C(0) in SQL and ObjectScript

3.4 Handles schema evolution

  1. Adds indexes to existing properties
  2. Describes the impact of changing the name and datatype of a property with existing data
  3. Describes consequences of purging cached queries after schema changes

3.5 Ensures scalability and performance

  1. Uses TUNE TABLE to optimize performance
  2. Interprets basic query plans
  3. Tests code correctness and performance in non-functional tests

4.1 Traverses and sorts arrays

  1. Describes how subscripts are sorted within an array
  2. Traverses subset of a subscript using $ORDER
  3. Traverses multi-level variables using $ORDER
  4. Checks the existence of array nodes with $DATA

4.2 Manipulates and processes lists

  1. Uses $LIST to insert/update/delete elements in a list
  2. Retrieves and iterates through elements in a list using $LISTGET and $LISTNEXT
  3. Converts between lists and strings

4.3 Manipulates strings

  1. Uses $PIECE to extract and manipulate delimited strings
  2. Uses $EXTRACT to retrieve substrings
  3. Uses $REPLACE, $TRANSLATE, and $ZSTRIP to manipulate strings
  4. Interpolates strings
  5. Uses regular expressions to search and replace patterns in strings
  6. Uses $ZCONVERT to escape and encode strings
  7. Recalls how to escape special characters within strings

4.4 Performs mathematical, logical, date, and time operations

  1. Uses mathematical operators 
  2. Uses logical operators
  3. Formats dates and times (e.g., uses $ZDATE, $ZTIME, and $HOROLOG)
  4. Performs arithmetic operations on date/time values

4.5 Uses decision and control structures

  1. Uses post-conditionals to control which commands are executed
  2. Distinguishes between the quit and return commands
  3. Identifies how expressions are evaluated in Boolean contexts

4.6 Executes and queries methods and objects

  1. Executes methods with $METHOD and $CLASSMETHOD
  2. Uses %IsA and %ClassName to obtain information about existing objects
  3. Uses %Dictionary to inspect class definitions

4.7 Uses APIs for common operations

  1. Reads and writes files with %Stream package
  2. Uses %Net to make HTTP requests, transfer files securely, and send emails

5.1 Uses InterSystems IRIS supported troubleshooting tools

  1. Uses tools provided in InterSystems IRIS for monitoring code performance
  2. Interprets class compilation errors 

5.2 Handles and logs runtime errors

  1. Uses TRY-CATCH to handle runtime errors
  2. Throws and handles exceptions in ObjectScript
  3. Reviews application error log for runtime failures
  4. Uses $STACK to analyze and trace runtime errors
  5. Converts error status codes to readable messages
  6. Differentiates between statuses and exceptions when troubleshooting

5.3 Diagnoses and debugs common runtime errors

  1. Diagnoses and debugs <SUBSCRIPT> errors
  2. Diagnoses and debugs <PROTECT> errors
  3. Diagnoses and debugs <FRAMESTACK> errors 
  4. Diagnoses and debugs <UNDEFINED> errors

Instructions: 

Please review the following instructions for scheduling and buying an exam:

  1. From our exam store, log in with your InterSystems Single Sign-On (SSO) account.
    1. If necessary, please register for an account.
  2. Select InterSystems ObjectScript Specialist - Beta (IOS-Beta) and click Get Started.
  3. Verify system compatibility as instructed. The Safe Exam Browser download requires administrative privileges on your device.
  4. Run the setup test to ensure the device satisfies the exam requirements.
  5. Schedule your exam – this must be done before checking out. The exam must be taken at least 24 hours after scheduling, but within 30 days of it.
  6. Review the InterSystems Certification Program Agreement.
  7. Confirm your appointment. You will receive an email from Certiverse with your exam appointment details.
  8. You can access your reservations and history through the Exam Dashboard available through the MY EXAMS menu.

Below are important considerations that we recommend to optimize your testing experience:

  • Read the Taking InterSystems Exams and Exam FAQs pages to learn about the test-taking experience.
  • Read the InterSystems Certification Exam Policies.
  • On the day of your exam, log in to Certiverse at least 10 minutes before your scheduled time, launch the exam under MY EXAMS, and wait for the proctor to connect.
  • Please have your valid government ID ready for identification. The proctor will walk you through the process of securing your room and releasing the exam to you. 

You may cancel or reschedule your appointment without penalty as long as the action is taken at least 24 hours in advance of your appointment. The voucher code will reactivate and you can use it to reschedule the exam.

Please contact certification@intersystems.com if you have any questions or need assistance, and we encourage you to share any feedback about the exam, whether positive or negative.

Announcement
· February 18, 2026

Meet Ashok Kumar - New Developer Community Moderator!

Hi Community,

Please welcome @Ashok Kumar T as our new Moderator in the Developer Community Team! 🎉

Let's greet Ashok with a round of applause and look at his bio!

@Ashok Kumar T is a Senior Software Engineer. 

A few words from Ashok: 

I'm a senior Software Engineer with over a decade of experience specializing in the InterSystems technology stack. Since 2014, my focus has been on leveraging the full power of the InterSystems ecosystem to solve complex data and integration challenges. I bring a deep understanding of both ObjectScript and modern IRIS implementations.

My professional philosophy is rooted in a commitment to core values: a constant willingness to learn and a proactive approach to sharing knowledge.

WARM WELCOME!

Thank you and congratulations, @Ashok Kumar T 👏

We're glad to have you on our moderators' team!

Article
· February 18, 2026 · 6m read

pyprod: Pure Python IRIS Interoperability

InterSystems IRIS Productions provide a powerful framework for connecting disparate systems across various protocols and message formats in a reliable, observable, and scalable manner. intersystems_pyprod, short for InterSystems Python Productions, is a Python library that enables developers to build these interoperability components entirely in Python. Designed for flexibility, it supports a hybrid approach: you can seamlessly mix new Python-based components with existing ObjectScript-based ones, leveraging your established IRIS infrastructure. Once defined, these Python components are managed just like any other; they can be added, configured, and connected using the IRIS Production Configuration page. 


A Quick Primer on InterSystems IRIS Productions

Key Elements of a Production

Image from Learning Services training material

An IRIS Production generally receives data from external interfaces, processes it through coordinated steps, and routes it to its destination. As messages move through the system, they are automatically persisted, making the entire flow fully traceable through IRIS’s visual trace and logging tools. The architecture relies on certain key elements:

  1. Business Hosts: These are the core building blocks—Services, Processes, and Operations—that pass persistable messages between one another.
  2. Adapters: Inbound and outbound adapters manage the interaction with the external world, handling the specific protocols needed to receive and send data.
  3. Callbacks: The engine uses specific callback methods to pass messages between hosts, either synchronously or asynchronously. These callbacks follow strict signatures and return a Status object to ensure execution integrity.
  4. Configuration Helpers: Objects such as Properties and Parameters expose settings to the Production Configuration UI, allowing users to easily instantiate, configure, and save the state of these components.

Workflow using pyprod

This is essentially a three-step process.

  1. Write your production components in a regular Python script. In that script, you import the required base classes from intersystems_pyprod and define your own components by subclassing them, just as you would with any other Python library.
  2. Load them into InterSystems IRIS by running the intersystems_pyprod (same name as the library) command from the terminal and passing it the path to your Python script. This step links the Python classes with IRIS so that they appear as production components and can be configured and wired together using the standard Production Configuration UI. 
  3. Create the Production using the Production Configuration page and start the Production.

NOTE: If you create all your components with all their Properties hardcoded within the Python script, you only need to add them to the Production and start it. 

You can connect pyprod to your IRIS instance with a one-time setup.


Simple Example

In this example, we demonstrate a synchronous message flow where a request originates from a Service, moves through a Process, and is forwarded to an Operation. The resulting response then travels the same path in reverse, passing from the Operation back through the Process to the Service. Additionally, we showcase how to utilize the IRISLog utility to write custom log entries.
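
The round trip described above can be illustrated with plain Python classes. Note this sketch deliberately uses no pyprod APIs; the class and method names are only illustrative of the message path:

```python
# Illustrative only: models the synchronous Service -> Process -> Operation
# round trip without any pyprod machinery.
class Operation:
    def on_message(self, request):
        # Terminal hop: produce the response
        return f"response to: {request}"

class Process:
    def __init__(self, target):
        self.target = target

    def on_request(self, request):
        # Forward synchronously and wait for the reply
        return self.target.on_message(request)

class Service:
    def __init__(self, target):
        self.target = target

    def on_process_input(self, data):
        # The response travels back along the same path
        return self.target.on_request(data)

service = Service(Process(Operation()))
print(service.on_process_input("request message"))
# response to: request message
```

In the real library, each hop is a persistable message and the engine records it, which is what makes the flow traceable in the visual trace tool.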

Step 1

Create your Production components using pyprod in the file HelloWorld.py

Here are some key parts of the code

  • Package Naming: We define iris_package_name, which prefixes all classes as they appear on the Production Configuration page (If omitted, the script name is used as the default prefix).
  • Persistable Messages: We define MyRequest and MyResponse. These are the essential data structures for communication, as only persistable objects can be passed between Services, Processes, and Operations.
  • The Inbound Adapter: Our adapter passes a string to the Service using the business_host_process_input method.
  • The Business Service: Implemented with the help of the OnProcessInput callback.
    • MyService receives data from the adapter and converts it into a MyRequest message
    • We use the ADAPTER IRISParameter to link the Inbound Adapter to the Service. Note that this attribute must be named ADAPTER in all caps to align with IRIS conventions.
    • We define a target IRISProperty, which allows users to select the destination component directly via the Configuration UI.
  • The Business Process: Implemented with the help of the OnRequest callback.
  • The Business Operation: Implemented with the help of the OnMessage callback. (You can also define a MessageMap.)
  • Logic & Callbacks: Finally, the hosts implement their core logic within standard callbacks like OnProcessInput and OnRequest, routing messages using the SendRequestSync method.

You can read more about each of these parts on the pyprod API Reference page and also using the Quick Start Guide.

import time

from intersystems_pyprod import (
    InboundAdapter,BusinessService, BusinessProcess, 
    BusinessOperation, OutboundAdapter, JsonSerialize, 
    IRISProperty, IRISParameter, IRISLog, Status)

iris_package_name = "helloworld"
class MyRequest(JsonSerialize):
    content: str

class MyResponse(JsonSerialize):
    content: str

class MyInAdapter(InboundAdapter):
    def OnTask(self):
        time.sleep(0.5)
        self.business_host_process_input("request message")
        return Status.OK()

class MyService(BusinessService):
    ADAPTER = IRISParameter("helloworld.MyInAdapter")
    target = IRISProperty(settings="Target")
    def OnProcessInput(self, input):
        persistent_message = MyRequest(input)
        status, response = self.SendRequestSync(self.target, persistent_message)
        IRISLog.Info(response.content)
        return status

class MyProcess(BusinessProcess):
    target = IRISProperty(settings="Target")
    def on_request(self, input):
        status, response = self.SendRequestSync(self.target,input)
        return status, response


class MyOperation(BusinessOperation):
    ADAPTER = IRISParameter("helloworld.MyOutAdapter")
    def OnMessage(self, input):
        status = self.ADAPTER.custom_method(input)
        response = MyResponse("response message")
        return status, response


class MyOutAdapter(OutboundAdapter):
    def custom_method(self, input):
        IRISLog.Info(input.content)
        return Status.OK()

 

Step 2

Once your code is ready, load the components to IRIS.

$ intersystems_pyprod /full/path/to/HelloWorld.py

    Loading MyRequest to IRIS...
    ...
    Load finished successfully.
    
    Loading MyResponse to IRIS...
    ...
    Load finished successfully.
    ...
    

Step 3

Add each host to the Production using the Production Configuration page.

The image below shows MyService and its target property being configured through the UI. Follow the same process to add MyProcess and MyOperation. Once the setup is complete, simply start the production to see your messages in motion.


Final Thoughts

By combining the flexibility of the Python ecosystem with the industrial-grade reliability of InterSystems IRIS, pyprod offers a modern path for building interoperability solutions. Whether you are developing entirely new "Pure Python" productions or enhancing existing ObjectScript infrastructures with specialized Python libraries, pyprod ensures your components remain fully integrated, observable, and easy to configure. We look forward to seeing what you build!


Quick Links

GitHub repository  

PyPi Package

Support the Project: If you find this library useful, please consider giving us a ⭐ on GitHub and suggesting enhancements. It helps the project grow and makes it easier for other developers in the InterSystems community to discover it!
Article
· February 18, 2026 · 7m read

How to Easily Add OpenAPI Specification Validation to Your REST APIs

In this article, I aim to demonstrate a couple of methods for easily adding validation to REST APIs on the InterSystems IRIS Data Platform. I believe a specification-first approach is an excellent idea for API development. IRIS already provides functionality to generate an implementation skeleton from a specification and to publish that specification for external developers (use it together with iris-web-swagger-ui for best results). The one important piece not yet implemented in the platform is a request validator. Let's fix that!

The task is as follows: all incoming requests must be validated against the API schema described in OpenAPI format. As you know, a request contains: a method (GET, POST, etc.), a URL with parameters, headers (Content-Type, for example), and a body (some JSON). All of these can be checked. To solve this task I will use Embedded Python, since the extensive open-source Python ecosystem already offers two suitable projects: openapi-core and openapi-schema-validator. One limitation here is that IRIS uses Swagger 2.0, an obsolete version of OpenAPI. Most tools do not support this version, so the first implementation of our validator will be limited to checking only the request body.

Solution based on openapi-schema-validator

Key points:

  • The solution is fully compatible with the specification-first approach InterSystems recommends for API development. You do not need to modify the generated API classes, except for one small detail, which I will discuss later.
  • Only the request body is validated.
  • We need to extract the request type definition from the OpenAPI specification (the spec.cls class).
  • The request JSON is matched to its definition in the specification by setting a vendor-specific content type.

First, you need to set a vendor-specific content type in the consumes property of the OpenAPI specification for your endpoint. It should look something like this: vnd.<company>.<project>.<api>.<request_type>+json. For example, I will use:

"paths":{
      "post":{
        "consumes":[
          "application/vnd.validator.sample_api.test_post_req+json"
        ],
...
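
Extracting the definition name back out of such a content type takes two splits: everything before the "+", then the last dot-separated token. This standalone Python sketch mirrors that logic (the function name is my own, purely for illustration):

```python
def definition_name(content_type: str) -> str:
    # Drop the "+json" suffix, then keep the last dot-separated token
    base = content_type.split("+", 1)[0]
    return base.rsplit(".", 1)[-1]

print(definition_name("application/vnd.validator.sample_api.test_post_req+json"))
# test_post_req
```

The dispatch class performs the same extraction in ObjectScript with nested $PIECE calls.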

Next, we need a base class for our dispatch class. Here is the complete code of this class; it is also available on Git.

Class SwaggerValidator.Core.REST Extends %CSP.REST
{

Parameter UseSession As Integer = 1;
ClassMethod OnPreDispatch(pUrl As %String, pMethod As %String, ByRef pContinue As %Boolean) As %Status
{
	Set tSC = ..ValidateRequest()
    
    If $$$ISERR(tSC) {
        Do ..ReportHttpStatusCode(##class(%CSP.REST).#HTTP400BADREQUEST, tSC)
        Set pContinue = 0
    }

    Return $$$OK
}

ClassMethod ValidateRequest() As %Status
{
    Set tSC = ##class(%REST.API).GetApplication($REPLACE($CLASSNAME(),".disp",""), .spec)
    Return:$$$ISERR(tSC) tSC

    Set defName = $PIECE($PIECE(%request.ContentType, "+", 1), ".", *)
    Return:defName="" $$$ERROR($$$GeneralError, $$$FormatText("No definition name found in Content-Type = %1", %request.ContentType))
    
    Set type = spec.definitions.%Get(defName)
    Return:type="" $$$ERROR($$$GeneralError, $$$FormatText("No definition found in specification by name = %1", defName))
    
    Set schema = type.%ToJSON() 
    Set body = %request.Content.Read()

    Try {Set tSC = ..ValidateImpl(schema, body)} Catch ex {Set tSC = ex.AsStatus()}

    Return tSC
}

ClassMethod ValidateImpl(schema As %String, body As %String) As %Status [ Language = python ]
{
    try:
        validate(json.loads(body), json.loads(schema))
    except Exception as e:
        return iris.system.Status.Error(5001, f"Request body is invalid: {e}")

    return iris.system.Status.OK()
}

XData %import [ MimeType = application/python ]
{
import iris, json
from openapi_schema_validator import validate
}

}

Here we are doing the following:

  1. OnPreDispatch() is overridden to add the validation. This code runs on every call to our API.
  2. ##class(%REST.API).GetApplication() is used to obtain the specification as a dynamic object (JSON).
  3. The definition name is extracted from the Content-Type header.
  4. The request schema is fetched by definition name: spec.definitions.%Get(defName)
  5. The request schema and the request body are passed to Python code for validation.

As you can see, it is all quite simple. Now you only need to change the Extends section of your disp.cls to SwaggerValidator.Core.REST. And, of course, install the Python library openapi-schema-validator on the server (as described here).

Solution based on openapi-core

Key points:

  • This solution works with a hand-coded REST interface. We do not use API Management tools to generate code from the OpenAPI specification; we simply have a REST service as a subclass of %CSP.REST.
  • We are therefore not limited to version 2.0/JSON and will use OpenAPI 3.0 in YAML format. This version offers more possibilities, and I find YAML more readable.
  • The following elements are checked: path and query parameters in the URL, Content-Type, and the request body.

To begin, let's take our specification, located at <server>/api/mgmnt/v1/<namespace>/spec/<web-application>. Yes, we get a generated OpenAPI specification even for hand-coded REST APIs. It is not a complete specification, because it does not contain the request and response schemas (the generator has no way of knowing where to get them). But the platform has already done half the work for us. Let's place this specification in an XData block named OpenAPI in the Spec.cls class. Next, we need to convert the specification to OpenAPI 3.0/YAML format and add definitions for requests and responses. You can use a converter, or simply ask Codex:

Please convert the specification in the @Spec.cls class to Swagger 3.0 and YAML format.

In the same way, we can ask Codex to generate the request/response schemas based on JSON examples.
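
A toy version of that inference step can also be written directly in Python. The sketch below derives a rough JSON Schema from a single example document; the function name and heuristics are my own, not part of any library:

```python
import json

def infer_schema(value):
    # Rough JSON-Schema inference from one example value.
    # Note: bool is checked before int, since bool is a subclass of int.
    if isinstance(value, dict):
        return {"type": "object",
                "properties": {k: infer_schema(v) for k, v in value.items()},
                "required": sorted(value)}
    if isinstance(value, list):
        return {"type": "array",
                "items": infer_schema(value[0]) if value else {}}
    if isinstance(value, bool):
        return {"type": "boolean"}
    if isinstance(value, int):
        return {"type": "integer"}
    if isinstance(value, float):
        return {"type": "number"}
    return {"type": "string"}

example = {"name": "Ada", "scores": [9.5], "active": True}
print(json.dumps(infer_schema(example), indent=2))
```

A real schema should of course be reviewed by hand: a single example cannot tell you which fields are optional, nullable, or constrained.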

By the way, vibe coding works quite well for IRIS development, but that is a topic for another time. Let me know if you would find it interesting!

As in the previous solution, we need to create a base class for our %CSP.REST. This class is very similar:

Class SwaggerValidator.Core.RESTv2 Extends %CSP.REST
{

Parameter UseSession As Integer = 1;
ClassMethod OnPreDispatch(pUrl As %String, pMethod As %String, ByRef pContinue As %Boolean) As %Status
{
	Set tSC = ..ValidateRequest()
    
    If $$$ISERR(tSC) {
        Do ..ReportHttpStatusCode(##class(%CSP.REST).#HTTP400BADREQUEST, tSC)
        Set pContinue = 0
    }

    Return $$$OK
}

ClassMethod ValidateRequest() As %Status
{
    Set tSC = ..GetSpec(.swagger) 
    Return:$$$ISERR(tSC)||(swagger="") tSC

    Set canonicalURI = %request.CgiEnvs("REQUEST_SCHEME")_"://"_%request.CgiEnvs("HTTP_HOST")_%request.CgiEnvs("REQUEST_URI")
    Set httpBody = $SELECT($ISOBJECT(%request.Content)&&(%request.Content.Size>0):%request.Content.Read(), 1:"")
    Set httpMethod = %request.CgiEnvs("REQUEST_METHOD")
    Set httpContentType = %request.ContentType
    Try {
        Set tSC = ..ValidateImpl(swagger, canonicalURI, httpMethod, httpBody, httpContentType)
    } Catch ex {
        Set tSC = ex.AsStatus()
    }

    Return tSC
}

/// The class Spec.cls must be located in the same package as the %CSP.REST implementation
/// The class Spec.cls must contain an XData block named 'OpenAPI' with swagger 3.0 specification (in YAML format) 
ClassMethod GetSpec(Output specification As %String, xdataName As %String = "OpenAPI") As %Status
{
    Set specification = ""
    Set specClassName = $CLASSNAME()
    Set $PIECE(specClassName, ".", *) = "Spec"
    Return:'##class(%Dictionary.ClassDefinition).%Exists($LISTBUILD(specClassName)) $$$OK
    Set xdata = ##class(%Dictionary.XDataDefinition).%OpenId(specClassName_"||"_xdataName,,.tSC)
    If $$$ISOK(tSC),'$ISOBJECT(xdata)||'$ISOBJECT(xdata.Data)||(xdata.Data.Size=0) {
		Set tSC = $$$ERROR($$$RESTNoRESTSpec, xdataName, specClassName)
	}
    Return:$$$ISERR(tSC) tSC
    
    Set specification = xdata.Data.Read()
    Return tSC
}

ClassMethod ValidateImpl(swagger As %String, url As %String, method As %String, body As %String, contentType As %String) As %Status [ Language = python ]
{
    spec = Spec.from_dict(yaml.safe_load(swagger))
    data = json.loads(body) if (body != "") else None
    headers = {"Content-Type": contentType}
    
    req = requests.Request(method=method, url=url, json=data, headers=headers).prepare()
    openapi_req = RequestsOpenAPIRequest(req)

    try:
        validate_request(openapi_req, spec=spec)
    except Exception as ex:
        return iris.system.Status.Error(5001, f"Request validation failed: {ex.__cause__ if ex.__cause__ else ex}")

    return iris.system.Status.OK()
}

XData %import [ MimeType = application/python ]
{
import iris, json, requests, yaml
from openapi_core import Spec, validate_request
from openapi_core.contrib.requests import RequestsOpenAPIRequest
}

}

Note: the class containing the specification must be named Spec.cls and located in the same package as your %CSP.REST implementation. The specification class looks like this:

Class Sample.API.Spec Extends %RegisteredObject
{

XData OpenAPI [ MimeType = application/yaml ]
{
    ... your YAML specification ...
}
}

To enable validation, you only need to make your API class extend SwaggerValidator.Core.RESTv2 and place the Spec.cls file next to it.

That is all I wanted to tell you about Swagger validation. Feel free to ask me questions.

Article
· February 18, 2026 · 9m read

Best Practices for Integrating AI with InterSystems for Real-Time Analytics

Your data pipelines are running. Your dashboards are live. But if AI isn't embedded directly into your InterSystems environment, you're leaving the most valuable insights sitting idle—processed too slowly to act on.

Real-time analytics isn't just about speed anymore. It's about intelligence at the point of action. InterSystems IRIS, the company's flagship data platform, is purpose-built to close the gap between raw operational data and AI-driven decisions—without the latency tax of moving data to an external ML system.

 

In this guide, we break down the proven best practices for integrating AI with InterSystems for real-time analytics, covering everything from architecture patterns to model deployment strategies that actually hold up in production.

Quick Overview

  • Platform: InterSystems IRIS Data Platform
  • Core Capability: Embedded AI/ML with real-time transactional + analytics workloads
  • Key Use Cases: Healthcare analytics, financial fraud detection, supply chain optimization
  • AI Integration Methods: IntegratedML, Python Gateway, PMML, REST API endpoints
  • Target Users: Data engineers, AI architects, enterprise developers


Why InterSystems IRIS for AI-Driven Real-Time Analytics

Most enterprise analytics architectures suffer from a common flaw: the AI layer lives outside the data layer. Data has to travel—from operational databases to data lakes, through transformation pipelines, into ML platforms—before a prediction can be made. By then, the moment has often passed.

InterSystems IRIS takes a fundamentally different approach. It combines transactional processing (OLTP), analytics (OLAP), and AI/ML capabilities in a single, unified platform. This convergence isn't just a convenience—it's a performance breakthrough. According to InterSystems, IRIS can ingest and analyze millions of events per second while simultaneously running machine learning models against that live data.

The result: AI predictions generated in milliseconds, not minutes. For industries where the cost of latency is measured in lives (healthcare) or dollars (financial services), this architecture is a game-changer.

Best Practice #1: Use IntegratedML for In-Database Model Training

IntegratedML is InterSystems' declarative machine learning engine built directly into IRIS SQL. Rather than extracting data to an external Python or R environment, you train and deploy models with SQL-style commands inside the database itself.

This approach eliminates the data movement overhead that plagues traditional ML pipelines. A model trained on 10 million patient records doesn't need to be serialized, transferred, and deserialized—it runs where the data lives.

How to Implement

  • Create a model: CREATE MODEL PatientRisk PREDICTING (RiskScore) FROM PatientData
  • Train with a single command: TRAIN MODEL PatientRisk
  • Generate predictions inline: SELECT PREDICT(PatientRisk) AS Risk FROM PatientData WHERE PatientID = 12345

Best practice: Use IntegratedML for structured tabular data where speed-to-deployment matters more than custom model architectures. For deep learning or custom neural networks, leverage Python Gateway instead.
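The three commands above can be written out as one SQL session. This is a sketch using the article's illustrative table and column names (PatientData, RiskScore); the optional VALIDATE step assumes a separate holdout table:

```sql
-- Define what to predict and from which table (illustrative names)
CREATE MODEL PatientRisk PREDICTING (RiskScore) FROM PatientData

-- Train in-database; no data leaves IRIS
TRAIN MODEL PatientRisk

-- Optionally validate against a holdout table before serving predictions
VALIDATE MODEL PatientRisk FROM PatientDataHoldout

-- Generate predictions inline with ordinary SQL
SELECT PREDICT(PatientRisk) AS Risk FROM PatientData WHERE PatientID = 12345
```

Because PREDICT() is just a SQL expression, it composes with joins, filters, and ORDER BY like any other column.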

Best Practice #2: Leverage Python Gateway for Advanced ML Frameworks

IntegratedML handles a wide range of classification and regression problems, but enterprise AI often demands more—custom neural networks, NLP pipelines, reinforcement learning, or computer vision models built in TensorFlow, PyTorch, or scikit-learn.

InterSystems Python Gateway solves this by embedding Python execution natively within IRIS. Instead of building a separate microservice to run your Python models, you call them directly from ObjectScript or via SQL stored procedures. The data never leaves the IRIS environment.

Key Implementation Tips

  • Install Python Gateway as an IRIS add-on and configure your Python environment path in the IRIS Management Portal
  • Use the IRISNative API to pass IRIS globals directly into Python objects—eliminating serialization overhead
  • Cache frequently used model objects in memory using IRIS's built-in caching layer to avoid re-loading on every prediction request
  • For high-throughput scenarios, deploy models as persistent Python processes rather than loading them per-request
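The "load once, serve many" caching pattern in the last two tips can be sketched in framework-agnostic Python. The model here is a stand-in lambda, not a real TensorFlow or PyTorch object, and the function names are illustrative; the point is that the expensive load happens exactly once per process, not once per prediction request:

```python
import threading

_model = None
_lock = threading.Lock()

def load_model():
    """Stand-in for an expensive model load (e.g. torch.load or joblib.load)."""
    return lambda features: sum(features) / len(features)  # dummy "model": returns the mean

def get_model():
    """Return the cached model, loading it once on first use (thread-safe)."""
    global _model
    if _model is None:
        with _lock:
            if _model is None:  # double-checked locking: re-test inside the lock
                _model = load_model()
    return _model

def predict(features):
    """Serve a prediction without re-loading the model on every request."""
    return get_model()(features)

print(predict([1.0, 2.0, 3.0]))  # the dummy model returns the mean: 2.0
```

In a persistent Python process behind IRIS, every prediction after the first hits the in-memory object, which is what makes per-request latency predictable.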

For organizations looking to accelerate production-grade AI deployment across complex enterprise environments, teams like Denebrix AI provide structured implementation support that spans model development, IRIS integration, and real-time pipeline validation.

Best Practice #3: Architect for Low-Latency with Adaptive Analytics

Real-time analytics requires more than fast data retrieval—it demands an architecture that adapts to changing data distributions. InterSystems Adaptive Analytics (powered by AtScale) bridges IRIS with BI tools like Tableau, Power BI, and Looker, providing a semantic layer that enables live, in-memory analytical queries without pre-aggregating data into cubes.

The key architectural principle here is pushdown optimization: analytics queries run as close to the data as possible, inside IRIS, rather than pulling raw rows into an external analytics engine. This can reduce query times from minutes to seconds for enterprise-scale datasets.

Architecture Recommendations

  • Define business metrics and KPIs in the Adaptive Analytics semantic layer—not in your BI tool—to ensure consistency across dashboards
  • Use IRIS columnar storage for analytics-heavy tables while keeping transactional tables in row-based storage
  • Implement multi-model data architecture: relational tables for transactions, globals for hierarchical data, and vector tables for similarity search in AI applications
  • Enable streaming analytics via IRIS's built-in message broker to process event streams without leaving the platform

Best Practice #4: Deploy Models with PMML for Portability

Not every AI model will be born inside IRIS. Data scientists often build models in external environments—SageMaker, Azure ML, Google Vertex AI—and need to deploy them into operational systems. InterSystems supports PMML (Predictive Model Markup Language), an open standard for representing trained models.

Importing a PMML model into IRIS means predictions can be generated by the platform without maintaining a live connection to the external ML environment. This is particularly valuable in regulated industries where data residency requirements prevent sending records to cloud inference endpoints.

PMML Deployment Workflow

  • Export trained model as PMML XML from your ML platform of choice
  • Import into IRIS using the DeepSee PMML engine or the InterSystems PMML Utils library
  • Wrap the PMML inference call in an IRIS stored procedure for easy integration with existing application code
  • Monitor prediction drift by logging PMML outputs alongside actuals in an IRIS analytics table
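Before importing a PMML file, it is often worth inspecting its DataDictionary to confirm the expected input fields. A minimal sketch using only the standard library's XML parser, with a hand-written PMML fragment and illustrative field names (a real export from SageMaker or Azure ML would be much larger):

```python
import xml.etree.ElementTree as ET

# A minimal, hand-written PMML fragment (field names are illustrative only).
PMML = """<PMML xmlns="http://www.dmg.org/PMML-4_4" version="4.4">
  <DataDictionary numberOfFields="2">
    <DataField name="Age" optype="continuous" dataType="double"/>
    <DataField name="RiskScore" optype="continuous" dataType="double"/>
  </DataDictionary>
</PMML>"""

def pmml_fields(pmml_text):
    """Extract (name, dataType) pairs from the PMML DataDictionary."""
    ns = {"pmml": "http://www.dmg.org/PMML-4_4"}
    root = ET.fromstring(pmml_text)
    return [(f.get("name"), f.get("dataType"))
            for f in root.findall(".//pmml:DataField", ns)]

print(pmml_fields(PMML))  # [('Age', 'double'), ('RiskScore', 'double')]
```

Because PMML is plain XML, this kind of pre-flight check can run anywhere in the pipeline, before the model ever touches the IRIS import step.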

Best Practice #5: Build GenAI Applications with Vector Search

Generative AI is redefining what's possible with enterprise data. InterSystems IRIS now supports vector embeddings natively, enabling semantic search, retrieval-augmented generation (RAG), and similarity-based recommendations directly within the platform—no external vector database required.

This is significant for real-time analytics: imagine a clinical decision support system that retrieves semantically similar patient cases at the moment a physician places an order, or a fraud detection engine that finds transactions matching known fraud patterns using embedding similarity rather than rigid rule matching.

Implementation Blueprint

  • Generate embeddings using models like OpenAI text-embedding-3-small or open-source alternatives (BAAI/bge, sentence-transformers)
  • Store embeddings in IRIS vector-type columns alongside your structured data
  • Use VECTOR_COSINE() or VECTOR_DOT_PRODUCT() SQL functions to run similarity queries inline with your analytics
  • For RAG applications, combine IRIS vector search with an LLM API call, passing retrieved context as part of the prompt
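The similarity query at the heart of this blueprint reduces to cosine similarity between embedding vectors. A local, pure-Python illustration of that quantity and of ranking stored vectors against a query, as a SQL similarity search would on the server; the 3-dimensional "embeddings" are toy values standing in for real model outputs:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity: dot(a, b) / (|a| * |b|)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy 3-d "embeddings" standing in for real model outputs.
query = [1.0, 0.0, 1.0]
docs = {"case_a": [1.0, 0.0, 1.0], "case_b": [0.0, 1.0, 0.0]}

# Rank stored vectors by similarity to the query, most similar first.
ranked = sorted(docs, key=lambda k: cosine_similarity(query, docs[k]), reverse=True)
print(ranked)  # ['case_a', 'case_b']
```

Storing the vectors in IRIS and expressing this ranking in SQL keeps the similarity search next to the structured data it describes, which is the whole point of avoiding a separate vector database.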

AI Integration Methods: Comparison at a Glance

| Method | Best For | Latency | Data Movement |
| --- | --- | --- | --- |
| IntegratedML | Tabular classification/regression | Very Low | None |
| Python Gateway | Custom ML frameworks | Low | None |
| PMML Import | Pre-trained external models | Low | None |
| REST API (external) | Large LLMs, cloud models | Medium-High | Yes |
| Vector Search | Semantic/similarity queries | Very Low | None |

Best Practice #6: Monitor, Retrain, and Govern AI Models Continuously

Production AI models degrade. Data distributions shift, business rules change, and models that were 95% accurate at deployment can slip to 70% within months. Real-time analytics environments are especially vulnerable because they're ingesting live, unpredictable data.

InterSystems IRIS provides the infrastructure for continuous model monitoring through its analytics and auditing capabilities. Build feedback loops that log predictions, compare them against actuals, and trigger retraining workflows when accuracy falls below defined thresholds.

Governance Checklist

  • Log every model prediction with input features, output score, and timestamp into an IRIS audit table
  • Set up automated drift detection using statistical tests (KS-test, PSI) on incoming feature distributions
  • Define retraining triggers: schedule-based (weekly), performance-based (accuracy < threshold), or event-based (data schema change)
  • Maintain model versioning in IRIS globals for rollback capability
  • Implement role-based access controls (RBAC) on model endpoints to ensure only authorized services can invoke AI predictions

Frequently Asked Questions

What is InterSystems IRIS IntegratedML?

IntegratedML is a declarative machine learning engine embedded directly in InterSystems IRIS. It lets developers train, validate, and deploy predictive models using SQL-like syntax, without moving data to an external ML platform. It's designed to reduce the complexity of bringing AI into production for developers who aren't data scientists.

How does InterSystems IRIS handle real-time AI inference?

IRIS runs AI inference in-process with the operational data, eliminating network round-trips to external ML services. Through IntegratedML, Python Gateway, and native PMML support, predictions are generated as part of SQL queries or application transactions—delivering millisecond latency at enterprise scale.

Can I use Python and TensorFlow with InterSystems IRIS?

Yes. The Python Gateway add-on enables direct Python execution within the IRIS environment. You can use any Python ML library—TensorFlow, PyTorch, scikit-learn, HuggingFace—and call models from ObjectScript or SQL. This allows teams to build models in familiar Python environments and deploy them without a separate inference microservice.

What are the limitations of IntegratedML?

IntegratedML is optimized for structured tabular data and standard ML tasks (classification, regression). It doesn't support custom neural network architectures, unstructured data like images or audio, or advanced techniques such as reinforcement learning. For these use cases, Python Gateway or external model integration via REST or PMML is recommended.

How does InterSystems IRIS support Generative AI applications?

IRIS supports GenAI through native vector storage and similarity search functions, enabling retrieval-augmented generation (RAG) workflows without a separate vector database. Teams can store embeddings alongside structured data, run semantic search queries in SQL, and combine results with external LLM API calls for applications like intelligent document retrieval or clinical decision support.

Is InterSystems IRIS suitable for healthcare AI analytics?

Yes, IRIS is widely adopted in healthcare, with purpose-built products like HealthShare and TrakCare built on the platform. It supports HL7 FHIR natively, provides HIPAA-compliant data handling, and integrates AI capabilities directly with clinical data—making it well-suited for predictive analytics in clinical and operational healthcare settings.

Final Thoughts

Integrating AI with InterSystems for real-time analytics isn't a single decision—it's a series of architectural choices that compound over time. Start with IntegratedML for fast time-to-value on structured prediction tasks. Layer in Python Gateway when your models outgrow declarative SQL. Embrace vector search as GenAI reshapes what enterprise applications can do.

The organizations winning with real-time AI aren't just faster—they're building systems where intelligence is inseparable from operations. InterSystems IRIS gives you the platform to do exactly that. The practices in this guide give you the roadmap.
