* fix(python): handle eval for single-line execution when the string input contains spaces (#756)
* Update Dockerfile.tomcat (#757)
* fix: tomcat builder setting env var
* fix: updating tomcat to 9.0.104
* Update Dockerfile.ubuntu22.04
* Update Dockerfile.ubuntu22.04
* Update Dockerfile.ubuntu22.04
* feat: creating KubernetesModelScaler class (#763)
* Update Dockerfile.ubuntu22.04
* feat: adding ability to attach a file to a vector db source (#736)
* Added AttachSourceToVectorDbReactor for uploading a PDF file to an existing CSV file and modified VectorFileDownloadReactor
* fix: proper return for the download and matching the reactor name
* fix: error for downloading single file vs multiple; error for copyToDirectory instead of copyFile
* chore: renaming so reactor matches VectorFileDownload
---------
Co-authored-by: Maher Khalil <themaherkhalil@gmail.com>
* Update Dockerfile.ubuntu22.04
* Update ubuntu2204.yml
* Update ubuntu2204.yml
* Update ubuntu2204_cuda.yml
* Update Dockerfile.nvidia.cuda.12.5.1.ubuntu22.04
* Update ubuntu2204_cuda.yml
* Update ubuntu2204.yml
* feat: exposing tools calling through models (#764)
* 587 unit tests for prerna.util.ds (#654)
* test(unit): unit tests for the prerna.util.ds package
* test(unit): unit tests for the prerna.util.ds.flatfile package
* test(unit): removed reflections, added parquet tests
* Update ubuntu2204.yml
* Update ubuntu2204.yml
* Update ubuntu2204.yml
* fix: update pipeline docker buildx version
* fix: ignore buildx
* fix: adjusting pipeline for cuda
* feat: switching dynamic sas to default false (#766)
* fix: changes to account for version 2.0.0 of pyjarowinkler (#769)
* chore: using 'Py' instead of 'py' to be consistent (#770)
* feat: full ast parsing of code to return evaluation of the last expression (#771)
* Python Deterministic Token Trimming for Message Truncation (#765)
* feat: deterministic-token-trimming
* feat: modifying logic such that system prompt is second to last message for truncation
---------
Co-authored-by: Maher Khalil <themaherkhalil@gmail.com>
* fix: added date added column to enginepermission table (#768)
* fix: add docker-in-docker container to run on self-hosted runner (#773)
Co-authored-by: Raul Esquivel <resmas.work@gmail.com>
* fix: properly passing in the parameters from kwargs/smss into model limits calculation (#774)
* fix: removing legacy param from arguments (#777)
* fix: Fix docker cache build issue (#778)
* adding no cache
* adding no cache
* feat: Adding Semantic Text Splitting & Token Text Splitting (#720)
* [696] - build - Add chonky semantic text splitting - Added the function for chonky semantic text splitting and integrated with existing flow.
* [696] - build - Add chonky semantic text splitting - Updated the code
* [696] - build - Add chonky semantic text splitting - Updated the code comments
* feat: adding reactor support through java
* feat: updating pyproject.toml with chonky package
* feat: check for default chunking method in smss
* [696] - feat - Add chonky semantic text splitting - Resolved the conflicts
* [696] - feat - Add chonky semantic text splitting - Organized the code.
* feat: adding chunking by tokens and setting as default
* updating comments on chunking strategies
---------
Co-authored-by: Weiler, Ryan <ryanweiler92@gmail.com>
Co-authored-by: kunal0137 <kunal0137@gmail.com>
* feat: allowing for tools message in full prompt (#780)
* UPDATE ::: Add docker-in-docker Dockerfile (#784)
* add docker in docker Dockerfile
* Update Dockerfile.dind
Remove python and tomcat arguments from Dockerfile
* fix: remove-paddle-ocr (#786)
* [#595] test(unit): adds unit test for prerna.engine.impl.model.kserve
Co-authored-by: Ryan Weiler <ryanweiler92@gmail.com>
* feat: Tag semoss image (#789)
* adding changes for non-release docker build
* adding non-release build logic to cuda-semoss builder
* updating push branches
* fix: branch names on docker builds
* fix: branch names on docker builds cuda
* fix: adding push condition - change to pyproject toml file; adding event input vars to env vars (#790)
* fix: python builder toml file change (#792)
* fix: Catch errors when calling pixels from Python (#787)
Co-authored-by: Weiler, Ryan <ryanweiler92@gmail.com>
* Creating db links between engines and default apps (#693)
* create db links between engine and default app
* Rename column APPID to TOOL_APP
* feat: add database_tool_app to getUserEngineList
---------
Co-authored-by: Weiler, Ryan <ryanweiler92@gmail.com>
* Adding sort options to the myengines reactor (#479)
* added sort feature to MyEnginesReactor and genericized reactor imports
* formatting
* overloading method
* validate sortList
---------
Co-authored-by: Ryan Weiler <ryanweiler92@gmail.com>
* feat: cleaning up unused imports in MyEngine reactor (#793)
* feat: Create Enum projectTemplate and update CreateAppFromTemplateReactor to accept existing appID for cloning applications (#621)
Co-authored-by: kunal0137 <kunal0137@gmail.com>
* Update GetEngineUsageReactor.java (#417)
Co-authored-by: Maher Khalil <themaherkhalil@gmail.com>
Co-authored-by: Ryan Weiler <ryanweiler92@gmail.com>
* Issue 596: Adds Unit Tests for prerna/engine/impl/model/responses and workers (#727)
* [#596] test(unit): adds unit tests
* fix: implements ai-agents suggestions
---------
Co-authored-by: Jeff Vitunac <jvitunac@gmail.com>
Co-authored-by: Ryan Weiler <ryanweiler92@gmail.com>
* 609 implement native blob storage for Azure, GCP, and AWS (#674)
* Initial commit : implementation for azure blob storage
* added dependency for azure in pom.xml
* update logic to fetch the metadata from list details
* changed functionality from listing containers to listing files within a selected container
* initial commit for google cloud storage implementation
* added field contant in enum class and removed unused method
* add methods to parse comma-separated local and cloud paths
* add methods to parse comma-separated local and cloud paths
* implementation for aws s3 bucket
* normalize container prefix path
* merged all: implementation for azure, aws and gcp
* refactor(storage): replace manual path normalization with normalizePath from Utility class
---------
Co-authored-by: pvijayaraghavareddy <pvijayaraghavareddy@WORKSPA-6QV71G7.us.deloitte.com>
Co-authored-by: Parth <parthpatel3@deloitte.com>
Co-authored-by: Ryan Weiler <ryanweiler92@gmail.com>
* Get Node Pool Information for Remote Models (#806)
* 590 unit tests for prerna.engine.impl (#808)
* test(unit): update to filesystems hijacking for testing files
* test: start of unit tests for abstract database engine
* test(unit): added unit test for prerna.engine.impl
* test(unit): finished tests for prerna.engine.impl
* test(unit): adding back unused assignment
---------
Co-authored-by: Ryan Weiler <ryanweiler92@gmail.com>
* Creating WordCountTokenizer Class (#802)
* feat: creating word count tokenizer class && falling back to word count tokenizer if tiktoken fails
* feat: updating comment
* feat: setting default chunking method as recursive (#810)
* Unit tests fixes and Unit test Class file location updates (#812)
* test(unit): moved tests to correct packages
* test(unit): fixed a couple of unit tests
* VectorDatabaseQueryReactor: output divider value for word doc chunks always 1 (#804)
* Code implementation for #733
* feat: Added code to resolve Divider page issue
* Console output replaced by LOGGERs as per review comments
* feat: replaced Console with Loggers
---------
Co-authored-by: Varaham <katchabi50@gmail.com>
Co-authored-by: Ryan Weiler <ryanweiler92@gmail.com>
* GetCurrentUserReactor (#818)
Adding GetCurrentUserReactor to return user info, including whether the user is an admin.
* Python User Class (#819)
* fix: trimming properties read from smss; fix: logging commands before executing (#821)
* Updating getNodePoolsInfo() to parse and return zk info and models active actual (#822)
* feat: update get node pool information for zk info and models active actual
* feat: get remote model configs
* Add unit tests for package prerna\engine\impl\vector (#728)
* Create ChromaVectorDatabaseEngineUnitTests.java
* completed tests for ChromaVectorDatabaseEngine class
* [#604] test(unit): Created ChromaVectorDatabaseEngine unit tests
* [604] tests(unit) : Completed test cases for ChromaVectorDatabaseEngine; update File operations to nio operations in ChromaVectorDatabaseEngine.java
* [#604] tests(unit): added unit tests for all vector database engines and util classes in the prerna\engine\impl\vector package
* [604] test(unit): replaced creating file paths with string literals with java.nio Paths.resolve/Paths.get methods
---------
Co-authored-by: Maher Khalil <themaherkhalil@gmail.com>
Co-authored-by: Ryan Weiler <ryanweiler92@gmail.com>
* feat: adding to the return of getenginemetadata (#813)
* feat: adding to the return of getenginemetadata
* fix: removing throws
---------
Co-authored-by: Arash Afghahi <48933336+AAfghahi@users.noreply.github.com>
Co-authored-by: Ryan Weiler <ryanweiler92@gmail.com>
* 718 create a single reactor to search both engines and apps (#794)
* feat(engineProject): Initial commit
* chore: 718 create a single reactor to search both engines and apps
* chore: 718 create a single reactor to search both engines and apps
---------
Co-authored-by: Ryan Weiler <ryanweiler92@gmail.com>
Co-authored-by: Vijayaraghavareddy <pvijayaraghavareddy@deloitte.com>
* feat: update openai wrapper to handle multiple images (#832)
* feat: adding user room map (#840)
* feat: hiding side menu bar for non admins (#833)
* Side menu changes
* Review Comments fixed
* Flag is renamed in Constants.java
* Review Comment fixed in Utility.java
* fix: cleaning up defaults and comments
---------
Co-authored-by: kunal0137 <kunal0137@gmail.com>
---------
Co-authored-by: Maher Khalil <themaherkhalil@gmail.com>
Co-authored-by: kunal0137 <kunal0137@gmail.com>
Co-authored-by: Ryan Weiler <ryanweiler92@gmail.com>
Co-authored-by: ManjariYadav2310 <manjayadav@deloitte.com>
Co-authored-by: dpartika <dpartika@deloitte.com>
Co-authored-by: Raul Esquivel <resmas.work@gmail.com>
Co-authored-by: Pasupathi Muniyappan <pasupathi.muniyappan@kanini.com>
Co-authored-by: resmas-tx <131498457+resmas-tx@users.noreply.github.com>
Co-authored-by: AndrewRodddd <62724891+AndrewRodddd@users.noreply.github.com>
Co-authored-by: radkalyan <107957324+radkalyan@users.noreply.github.com>
Co-authored-by: samarthKharote <samarth.kharote@kanini.com>
Co-authored-by: Shubham Mahure <shubham.mahure@kanini.com>
Co-authored-by: rithvik-doshi <81876806+rithvik-doshi@users.noreply.github.com>
Co-authored-by: Mogillapalli Manoj kumar <86736340+Khumar23@users.noreply.github.com>
Co-authored-by: Jeff Vitunac <jvitunac@gmail.com>
Co-authored-by: pvijayaraghavareddy <pvijayaraghavareddy@WORKSPA-6QV71G7.us.deloitte.com>
Co-authored-by: Parth <parthpatel3@deloitte.com>
Co-authored-by: KT Space <119169984+Varaham@users.noreply.github.com>
Co-authored-by: Varaham <katchabi50@gmail.com>
Co-authored-by: ericgonzal8 <ericgonzalez8@deloitte.com>
Co-authored-by: Arash Afghahi <48933336+AAfghahi@users.noreply.github.com>
Co-authored-by: Vijayaraghavareddy <pvijayaraghavareddy@deloitte.com>
Co-authored-by: ammb-123 <ammb@deloitte.com>
Google Cloud, Azure, and AWS Native Storage - Implementation and Testing
Description
This implementation provides methods for managing files and directories in Google Cloud Storage (GCS), Azure Blob Storage, and AWS S3, with additional functionality such as retry operations, rollback capabilities, and synchronization between local and cloud storage.
Methods:
* Lists all objects in the configured storage bucket (GCS, Azure, or AWS S3).
* Lists all objects along with their metadata (size, creation date, etc.) from the configured storage.
* Synchronizes the contents of a local directory to a specified path in cloud storage, including:
  * Removing empty directories locally.
  * Deleting extra objects from cloud storage that are not present locally.
  * Uploading and syncing files.
* Synchronizes the contents of a cloud storage path to a specified local directory, including:
  * Deleting empty objects from cloud storage.
  * Syncing files to local storage.
  * Deleting local files that are not present in cloud storage.
  * Removing empty directories locally.
* Copies a specific local file to the configured cloud storage:
  * Removes empty local directories before upload.
  * Uploads the file to cloud storage.
  * Removes empty objects from cloud storage after upload.
* Copies a specific file from cloud storage to the local file system:
  * Deletes empty objects (zero-byte files) from cloud storage.
  * Downloads the file to local storage.
  * Deletes empty local directories after download.
* Deletes a file or object from the configured cloud storage (GCS, Azure, or AWS S3).
* Deletes objects from the specified path but retains the folder structure if leaveFolderStructure is true.
* Recursively deletes all contents of the specified folder in the configured cloud storage (GCS, Azure, or AWS S3).
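The local-to-cloud synchronization steps above can be sketched as a small routine. This is a minimal illustration, not the engine's actual API: the function name `sync_local_to_cloud` and the dict-backed stand-in for the cloud store are hypothetical.

```python
import os
from pathlib import Path

def sync_local_to_cloud(local_dir, cloud):
    """Mirror a local directory into `cloud`, a dict standing in for a
    cloud bucket (relative POSIX path -> bytes). Hypothetical sketch."""
    local_dir = Path(local_dir)
    # Step 1: remove empty directories locally (deepest first).
    for dirpath, dirnames, filenames in os.walk(local_dir, topdown=False):
        if not dirnames and not filenames and Path(dirpath) != local_dir:
            os.rmdir(dirpath)
    # Collect local files keyed by their path relative to the sync root.
    local_files = {
        p.relative_to(local_dir).as_posix(): p.read_bytes()
        for p in local_dir.rglob("*") if p.is_file()
    }
    # Step 2: delete extra cloud objects that are not present locally.
    for key in set(cloud) - set(local_files):
        del cloud[key]
    # Step 3: upload (overwrite) every local file.
    cloud.update(local_files)
    return cloud
```

The delete-extras step is what makes this a true mirror rather than a one-way copy; the cloud-to-local direction described above is the same logic with the roles of the two sides swapped.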
Additional Functionalities:
* Retry Operations: Automatically retries failed upload, download, or delete operations up to 3 times.
* Rollback Mechanism:
  * rollbackUpload: Reverts uploaded files if the operation fails.
  * rollbackDownload: Reverts downloaded files if the operation fails.
  * retryDelete: Retries deletion if it fails, up to 3 attempts.
* Tracking:
  * Successfully processed files are added to the uploadedFiles, downloadedFiles, and deletedFiles lists.
  * Failed operations are tracked in failedFiles.
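The retry-and-track pattern described above can be sketched as follows. The class and method names here are hypothetical stand-ins for the engine's internals; only the behavior (up to 3 attempts, success/failure tracking lists, rollback of completed uploads) follows the description.

```python
MAX_RETRIES = 3  # per the description: retry failed operations up to 3 times

class CloudSyncTracker:
    """Hypothetical sketch of retry + rollback bookkeeping."""

    def __init__(self):
        self.uploaded_files = []    # successfully uploaded paths
        self.downloaded_files = []  # successfully downloaded paths
        self.failed_files = []      # paths whose operation exhausted retries

    def run_with_retry(self, path, operation, success_list):
        """Run operation(path) up to MAX_RETRIES times; track the outcome."""
        for attempt in range(1, MAX_RETRIES + 1):
            try:
                operation(path)
            except Exception:
                if attempt == MAX_RETRIES:
                    self.failed_files.append(path)
                    return False
            else:
                success_list.append(path)
                return True

    def rollback_upload(self, delete_remote):
        """Revert uploads made during a failed batch, newest first."""
        for path in reversed(self.uploaded_files):
            delete_remote(path)
        self.uploaded_files.clear()
```

Rolling back in reverse order mirrors how a failed multi-file upload would be unwound: the most recently written objects are deleted first.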
How to Test (Using App Terminal)

Step 1: Initialize the Cloud Storage Engine

1) Google Cloud Storage
Use the following pixel code format to create a storage catalog and connect to Google Cloud Storage:
CreateStorageEngine( storage=["Any_Name"], storageDetails=[{"STORAGE_TYPE":"GOOGLE_CLOUD_NATIVE_STORAGE","NAME":"Any_Name","GCS_SERVICE_ACCOUNT_FILE":"Your_Account_Key","GCS_BUCKET":"Your_Bucket_Name","GCS_PROJECT_ID":"Your_Project_Id"}] )

2) Azure Blob Storage
Use the following pixel code format to create a storage catalog and connect to Azure Blob Storage:
CreateStorageEngine( storage=["Any_Name"], storageDetails=[{"STORAGE_TYPE":"MICROSOFT_AZURE_NATIVE_BLOB_STORAGE","NAME":"Any_Name","AZ_CONN_STRING":"Your_Account_Connection_String"}] )

3) AWS S3 Storage
Use the following pixel code format to create a storage catalog and connect to AWS S3 Storage:
CreateStorageEngine( storage=["Any_Name"], storageDetails=[{"STORAGE_TYPE":"AMAZON_S3_NATIVE","NAME":"Any_Name","S3_REGION":"Your_Account_Region","S3_BUCKET":"Your_Bucket_Name","S3_ACCESS":"Your_Account_AccessKey","S3_SECRET":"Your_Account_SecretKey"}] )

Step 2: Navigate to the Storage Catalog
After successfully creating the storage engine, open the Storage Catalog and use the Usage tab to interact with the cloud storage.

Step 3: Test Each Method
Use the pixel code above to test each method, providing the required paths as specified in the method signatures.