
Article
· July 23, 2024 1m read

Implemented Community Idea about Generative AI for emails in Python Contest

I implemented a Python Flask application for the 2024 Python Contest with a page that provides the common form fields for an outgoing email, such as To and CC. It also lets you enter a message and upload text-based attachments.

Then, using LlamaIndex in Python, the app analyzes the content you entered and reports in a result box whether there is anything that should stop you from sending that email.
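
As a rough sketch of the idea (hypothetical names here; the real code is in the repo), the check boils down to indexing the draft plus its attachments and asking one question:

# Hypothetical sketch of the LlamaIndex check; see the repo for the actual code.
from llama_index.core import Document, VectorStoreIndex

def review_outgoing_email(message: str, attachments: list[str]) -> str:
    # Index the draft message together with any text-based attachments.
    docs = [Document(text=message)] + [Document(text=text) for text in attachments]
    index = VectorStoreIndex.from_documents(docs)
    # Ask whether anything should stop us from sending this email.
    response = index.as_query_engine().query(
        "Is there anything sensitive, confidential, or otherwise problematic "
        "in this outgoing email that should stop me from sending it?"
    )
    return str(response)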

Take a look at the GitHub repo here.

Announcement
· July 23, 2024

[Video] Organize Your Code with Namespaces

Hi Community,

What are the advantages of using multiple namespaces for your code? Learn some of the benefits in this discussion with @Derek Robinson, Senior Online Course Developer, and @Scott Clark, Implementation Specialist:

Using Multiple Namespaces

In this video, you will hear some potential use cases for creating multiple namespaces.

Ready to try it yourself? See how to set up namespaces and databases in the Management Portal in InterSystems IRIS® data platform (video, 2m).

Article
· July 23, 2024 4m read

Databricks Station - InterSystems Cloud SQL

 

A Quick Start to InterSystems Cloud SQL Data in Databricks

Getting up and running in Databricks against InterSystems Cloud SQL consists of four parts:

  • Obtaining Certificate and JDBC Driver for InterSystems IRIS
  • Adding an init script and external library to your Databricks Compute Cluster
  • Getting Data
  • Putting Data

 

Download X.509 Certificate/JDBC Driver from Cloud SQL

Navigate to the overview page of your deployment. If you do not have external connections enabled, enable them, then download your certificate and the JDBC driver from the overview page.

 

I have used intersystems-jdbc-3.8.4.jar and intersystems-jdbc-3.7.1.jar successfully in Databricks; both are available from Driver Distribution.

Init Script for your Databricks Cluster

The easiest way to import one or more custom CA certificates to your Databricks cluster is to create an init script that adds the entire CA certificate chain to both the Linux SSL and Java default certificate stores and sets the REQUESTS_CA_BUNDLE property. Paste the contents of your downloaded X.509 certificate into the top block of the following script:

import_cloudsql_certificate.sh
#!/bin/bash

cat << 'EOF' > /usr/local/share/ca-certificates/cloudsql.crt
-----BEGIN CERTIFICATE-----
<PASTE>
-----END CERTIFICATE-----
EOF

update-ca-certificates

PEM_FILE="/etc/ssl/certs/cloudsql.pem"
PASSWORD="changeit"
JAVA_HOME=$(readlink -f /usr/bin/java | sed "s:bin/java::")
KEYSTORE="$JAVA_HOME/lib/security/cacerts"
CERTS=$(grep 'END CERTIFICATE' $PEM_FILE| wc -l)

# To process multiple certs with keytool, you need to extract
# each one from the PEM file and import it into the Java KeyStore.
for N in $(seq 0 $(($CERTS - 1))); do
  ALIAS="$(basename $PEM_FILE)-$N"
  echo "Adding to keystore with alias:$ALIAS"
  cat $PEM_FILE |
    awk "n==$N { print }; /END CERTIFICATE/ { n++ }" |
    keytool -noprompt -import -trustcacerts \
            -alias $ALIAS -keystore $KEYSTORE -storepass $PASSWORD
done
echo "export REQUESTS_CA_BUNDLE=/etc/ssl/certs/ca-certificates.crt" >> /databricks/spark/conf/spark-env.sh
echo "export SSL_CERT_FILE=/etc/ssl/certs/ca-certificates.crt" >> /databricks/spark/conf/spark-env.sh

Now that you have the init script, upload it to a Volume in Unity Catalog.
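
If you prefer to script the upload rather than use the UI, the Databricks Python SDK can copy it into a volume for you (a sketch; the /Volumes path below is a placeholder for your own catalog/schema/volume):

# Sketch: push the init script to a Unity Catalog volume with the Databricks SDK.
# The /Volumes path is a placeholder - substitute your catalog/schema/volume.
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()  # reads credentials from the environment or ~/.databrickscfg
with open("import_cloudsql_certificate.sh", "rb") as f:
    w.files.upload(
        "/Volumes/main/default/init_scripts/import_cloudsql_certificate.sh",
        f,
        overwrite=True,
    )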

Once the script is on a volume, you can attach it to the cluster from the volume under your cluster's Advanced Options.


Next, add the InterSystems JDBC driver library to the cluster...

...and either start or restart your compute.

Databricks Station - Inbound InterSystems IRIS Cloud SQL

 

Create a Python notebook in your workspace, attach it to your cluster, and test pulling data into Databricks. Under the hood, Databricks will be using PySpark, in case that is not immediately obvious.

The following Spark dataframe construction is all you should need; you can grab your connection info from the overview page as before.

df = (spark.read
  .format("jdbc")
  .option("url", "jdbc:IRIS://k8s-05868f04-a4909631-ac5e3e28ef-6d9f5cd5b3f7f100.elb.us-east-1.amazonaws.com:443/USER")
  .option("driver", "com.intersystems.jdbc.IRISDriver")
  .option("dbtable", "(SELECT name, category, review_point FROM SQLUser.scotch_reviews) AS temp_table")
  .option("user", "SQLAdmin")
  .option("password", "REDACTED")
  .option("connection security level", "10")
  .option("sslConnection", "true")
  .load())

df.show()

Illustrating the dataframe output from data in Cloud SQL... boom!
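
As an aside, Spark also accepts a query option in place of the dbtable subquery wrapper, which reads a little cleaner; a sketch with the same connection settings as above:

# Alternative read: let Spark wrap the SQL itself via the "query" option.
df = (spark.read
  .format("jdbc")
  .option("url", "jdbc:IRIS://k8s-05868f04-a4909631-ac5e3e28ef-6d9f5cd5b3f7f100.elb.us-east-1.amazonaws.com:443/USER")
  .option("driver", "com.intersystems.jdbc.IRISDriver")
  .option("query", "SELECT name, category, review_point FROM SQLUser.scotch_reviews")
  .option("user", "SQLAdmin")
  .option("password", "REDACTED")
  .option("connection security level", "10")
  .option("sslConnection", "true")
  .load())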

Databricks Station - Outbound InterSystems IRIS Cloud SQL

 

Let's now take what we read from IRIS and write it back with Databricks. If you recall, we read only 3 fields into our dataframe, so let's write that back immediately, specifying "overwrite" mode.

df = (spark.read
  .format("jdbc")
  .option("url", "jdbc:IRIS://k8s-05868f04-a4909631-ac5e3e28ef-6d9f5cd5b3f7f100.elb.us-east-1.amazonaws.com:443/USER")
  .option("driver", "com.intersystems.jdbc.IRISDriver")
  .option("dbtable", "(SELECT TOP 3 name, category, review_point FROM SQLUser.scotch_reviews) AS temp_table")
  .option("user", "SQLAdmin")
  .option("password", "REDACTED")
  .option("connection security level", "10")
  .option("sslConnection", "true")
  .load())

df.show()

mode = "overwrite"
properties = {
    "user": "SQLAdmin",
    "password": "REDACTED",
    "driver": "com.intersystems.jdbc.IRISDriver",
    "sslConnection": "true",
    "connection security level": "10",
}

df.write.jdbc(url="jdbc:IRIS://k8s-05868f04-a4909631-ac5e3e28ef-6d9f5cd5b3f7f100.elb.us-east-1.amazonaws.com:443/USER", table="databricks_scotch_reviews", mode=mode, properties=properties)
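
If you want a quick sanity check of the write, you could read the new table straight back with the same URL and properties:

# Read the freshly written table back to verify the round trip.
check = spark.read.jdbc(
    url="jdbc:IRIS://k8s-05868f04-a4909631-ac5e3e28ef-6d9f5cd5b3f7f100.elb.us-east-1.amazonaws.com:443/USER",
    table="databricks_scotch_reviews",
    properties=properties,
)
check.show()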

Executing the Notebook

 
Illustrating the data in InterSystems Cloud SQL!

Things to Consider

  • By default, PySpark writes data using multiple concurrent tasks, which can result in partial writes if one of the tasks fails.
  • To ensure that the write operation is atomic and consistent, you can configure PySpark to write data using a single task (i.e., set the number of partitions to 1), as sketched after this list, or use an IRIS-specific feature such as transactions.
  • Additionally, you can use PySpark's DataFrame API to apply filtering and aggregation as part of the read so that less data is pulled from the database, which reduces the amount of data that needs to be transferred over the network.
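
For instance, a single-task write can be forced by collapsing the dataframe to one partition before the JDBC write; a sketch reusing the URL and properties from earlier:

# Sketch: force a single-partition (single-task) JDBC write to avoid partial writes.
df.coalesce(1).write.jdbc(
    url="jdbc:IRIS://k8s-05868f04-a4909631-ac5e3e28ef-6d9f5cd5b3f7f100.elb.us-east-1.amazonaws.com:443/USER",
    table="databricks_scotch_reviews",
    mode="overwrite",
    properties=properties,
)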
Digest
· July 23, 2024

Welcome to the InterSystems 2024 Python Programming Contest!

Hi! We're excited to invite you to join the new Python-focused InterSystems online programming contest!

🏆 InterSystems 2024 Python Programming Contest 🏆

Duration: July 15 - August 4, 2024 (US Eastern Time)

Prize pool: $14,000


Topic

Develop any solution on InterSystems IRIS, InterSystems IRIS for Health, or IRIS Cloud SQL using Python (or Embedded Python) as the programming language.

General requirements:

  1. The application or library must be fully functional. It should not be an import of, or a direct interface to, an already existing library in another language (except C++, where you really do need to do a lot of work to create an interface for IRIS). It should not be a copy-paste of an existing application or library.
  2. Valid applications: 100% new Open Exchange apps, or existing applications with significant improvements. Applications submitted by participants/teams will be admitted to the contest only after review by our team.
  3. The application should work on IRIS Community Edition or IRIS for Health Community Edition. Both can be downloaded as host (Mac, Windows) versions from the Evaluation site, or used as containers pulled from the InterSystems Container Registry or the Community Containers: intersystemsdc/iris-community:latest or intersystemsdc/irishealth-community:latest.
  4. The application must be open source and published on GitHub.
  5. The application's README file should be in English, contain installation steps, and include a video demo and/or a description of how the application works.
  6. Each developer may submit a maximum of 3 entries.

Please note: our experts will make the final decision on whether an application is approved for the contest, based on its complexity and usefulness. Their decision is final and not subject to appeal.

Prizes

1. Experts Nomination - winners are selected by a specially assembled panel of experts:

 

🥇 1st place - $5,000

🥈 2nd place - $3,000

🥉 3rd place - $1,500

🏅 4th place - $750

🏅 5th place - $500

🌟 6th-10th places - $100

2. Community Nomination - for the applications that gather the most votes:

 

🥇 1st place - $1,000

🥈 2nd place - $750

🥉 3rd place - $500

🏅 4th place - $300

🏅 5th place - $200

If several participants receive the same number of votes, they are all considered winners, and the prize money is shared among them.

Who can participate?

Any member of the Developer Community can participate, except InterSystems employees (InterSystems contractors may participate). Don't have an account yet? Create one now!

👥 Developers can team up to create a collaborative application. A team may include 2 to 5 developers.

Please note: list your team members in your README file (with links to their Community profiles).

Important deadlines:

🛠 Application development and registration phase:

  • July 15, 2024 (00:00 EST): Contest begins.
  • July 28, 2024 (23:59 EST): Submission deadline.

Voting period:

  • July 29, 2024 (00:00 EST): Voting begins.
  • August 4, 2024 (23:59 EST): Voting ends.

Note: throughout the entire contest period (development and voting), developers can continue to edit and improve their applications.

Helpful resources:

✓ Documentation

✓ Example applications and libraries

✓ Online courses

✓ Videos

✓ IRIS for beginners

✓ ObjectScript Package Manager (IPM) for beginners

✓ How to submit your application to the contest

Need help?

Join the contest channel on the InterSystems Discord server, or leave a comment under this post.

We look forward to your excellent submissions - join our coding contest and win!


❗️ By participating in this contest, you agree to the competition terms listed here. Please read them carefully before proceeding. ❗️

Question
· July 23, 2024

JWT and CORS

Hi

I'm trying to use JWT authentication on a REST application in IRIS. The login endpoints are correctly "injected" into the application. Login works fine with Postman and other REST clients, and subsequent calls to my REST API using the bearer token work fine (correctly authenticated). So far, so good.

The problem is that it doesn't work with Axios, so I can only test the API, not integrate it into my application. I found out the reason: Axios applies CORS, whereas Postman and other REST test clients do not; that is, they don't send the "preflight" OPTIONS request, they send the POST request directly, and apparently IRIS is happy not having to deal with CORS in this case (in other words, it does not check the "allow origin" header, probably because there was no OPTIONS call?). Unfortunately, the reverse is not true. Axios sends the OPTIONS preflight, to which IRIS responds with a 500 internal server error and no "Access-Control-Allow-Origin" response header. Axios still attempts to send the POST, but it gets a NS_ERROR_DOM_BAD_URI error because it failed CORS validation.
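
For reference, the failing preflight can be reproduced outside the browser by sending the OPTIONS request yourself; a sketch in Python (the URL and origin are placeholders for my setup):

# Reproduce the CORS preflight that Axios sends (placeholder URL and origin).
import requests

resp = requests.options(
    "https://my-iris-server/api/myapp/login",  # placeholder login endpoint
    headers={
        "Origin": "http://localhost:3000",
        "Access-Control-Request-Method": "POST",
        "Access-Control-Request-Headers": "authorization,content-type",
    },
)
print(resp.status_code)  # 500 in my case
print(resp.headers.get("Access-Control-Allow-Origin"))  # None - header missing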

As far as I can tell based on the CORS specification, Axios is right to apply CORS in this case, as this request does not (and CANNOT) match the criteria for skipping CORS (as described on MDN: https://developer.mozilla.org/en-US/docs/Web/HTTP/CORS). There does not seem to be a way to force Axios not to send the preflight when the situation legitimately calls for CORS.

I have correctly configured the dispatch class to handle CORS; without JWT authentication, CORS is handled correctly. However, the dispatch class is never invoked for the JWT authentication login API (not even OnPreHTTP). Apparently the CSP engine handles the login API on its own and never delegates to the dispatch class. So as far as I can tell, there's no way to intercept the OPTIONS call to somehow ignore or accept it.

Is there a way to make this work in IRIS? Or, conversely, does anyone know of a (minimally invasive) way to force Axios to bypass the preflight for this specific call?

Thanks
