Client
A major educational testing and assessment organization 
Goal
Rationalize and modernize the BI reporting system to deliver a better user experience with quality assurance at reduced cost
Tools and Technologies
Power BI, Semantic Modeling, MSBI
Business Challenge
The client had an estate of 400+ Power BI and 600+ SSRS reports, many of them obsolete and due for removal and archival.
Users faced multiple challenges that had to be identified by collating feedback so that appropriate solutions could be developed. The aim was to modernize and rationalize the system to achieve better governance and to reduce infrastructure and support costs.
 
Solution
- Collated user feedback to pinpoint key challenges and improvement areas
- Identified, removed, and archived many obsolete and orphaned reports (a sketch of the approach follows this list)
- Built a holistic semantic model to support reports with self-serve BI capability
- Introduced new, rationalized, and interactive reports/dashboards
- Enhanced UI/UX design with a provision to introduce branding
- Developed a robust process for data governance and quality assurance
- Created extensive documentation to enable complete user self-serve capabilities
- Trained users on BI self-serve
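Rationalization of this kind typically starts from report usage telemetry. As a purely illustrative sketch, assuming a hypothetical usage extract with report_id, last_viewed, and views_90d columns (not the client's actual data), obsolete-report candidates can be flagged like this:

```python
# Hypothetical illustration of flagging stale reports from usage logs;
# column names and thresholds are assumptions, not the client's data.
import pandas as pd

def flag_stale_reports(usage: pd.DataFrame, max_idle_days: int = 180) -> pd.DataFrame:
    """Return reports with no recent views, as candidates for archival review."""
    cutoff = pd.Timestamp.today() - pd.Timedelta(days=max_idle_days)
    stale = usage[(usage["last_viewed"] < cutoff) | (usage["views_90d"] == 0)]
    return stale.sort_values("last_viewed")

usage = pd.DataFrame({
    "report_id": ["sales_daily", "legacy_kpi", "ops_summary"],
    "last_viewed": pd.to_datetime(["2024-05-01", "2022-11-10", "2024-06-20"]),
    "views_90d": [140, 0, 37],
})
print(flag_stale_reports(usage))
```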
 
Outcomes
- Achieved 80% reduction in the number of reports post-rationalization
- Reduced data requests and enquiries by 60% by making information readily available through documentation
- Delivered 60% to 80% reduction in time required to build new custom reports
- Enhanced user confidence in data generated
- Increased number of users and per report user views
 
Modern system for insurance data management
 
Client
A large NY-based life insurance and investment company 
Goal
Create a secure, automated solution for data ingestion and a robust framework for distribution across channels 
Tools and Technologies
Python, PySpark, AWS Glue/Redshift/Lambda/S3/Aurora, Stonebranch, Jira, GitHub
Business Challenge
The client's legacy product data infrastructure (PACE) and other systems neither provided fully secure access nor enabled efficient quality checks, which hampered system integration as well as data ingestion and distribution.
Workflows and checks were not adequately automated, and there was no reusable framework for generating and delivering outbound data files aligned with business requirements.
 
Solution
- Created reusable and scalable ETL/ELT pipelines using Python and AWS services
- Integrated Stonebranch for orchestration and automated job scheduling, with monitoring mechanisms and alerts
- Tuned Redshift queries and optimized data ingestion processes to reduce latency and improve throughput
- Defined data specifications and output formats as per business needs
- Built a configurable pipeline to create dynamic CSV/Excel files from Redshift views
- Automated file delivery via email/SFTP, monitored and orchestrated by Stonebranch (a simplified sketch follows this list)
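A minimal sketch of the export-and-deliver pattern, assuming a hypothetical Redshift view, file specification, and SFTP endpoint (all hosts, names, and credentials below are illustrative, not the client's configuration):

```python
# Illustrative only: exports a Redshift view to CSV, then pushes it over SFTP.
# All hosts, credentials, and view names are hypothetical placeholders.
import csv
import paramiko
import psycopg2

file_spec = {
    "view": "reporting.policy_summary_v",      # hypothetical Redshift view
    "outfile": "/tmp/policy_summary.csv",
    "sftp_host": "files.example.com",
    "sftp_path": "/inbound/policy_summary.csv",
}

def export_view_to_csv(conn, spec):
    """Write the contents of a Redshift view to a local CSV file."""
    with conn.cursor() as cur, open(spec["outfile"], "w", newline="") as f:
        cur.execute(f"SELECT * FROM {spec['view']}")
        writer = csv.writer(f)
        writer.writerow([col[0] for col in cur.description])  # header row
        writer.writerows(cur)

def deliver_via_sftp(spec, username, key_path):
    """Push the generated file to the consumer's SFTP drop zone."""
    ssh = paramiko.SSHClient()
    ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    ssh.connect(spec["sftp_host"], username=username, key_filename=key_path)
    sftp = ssh.open_sftp()
    sftp.put(spec["outfile"], spec["sftp_path"])
    sftp.close()
    ssh.close()

conn = psycopg2.connect(host="redshift.example.com", port=5439,
                        dbname="prod", user="etl_user", password="***")
export_view_to_csv(conn, file_spec)
deliver_via_sftp(file_spec, username="etl_bot", key_path="/etc/keys/etl_bot")
```

In the engagement described above, jobs of this kind were scheduled and monitored by Stonebranch rather than run ad hoc.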
 
Outcomes
- Delivered improved data distribution and a reusable framework for ingesting and distributing data across existing and new products
- Streamlined operations and improved data accessibility
- Enhanced performance and scalability
- Ensured better data quality and governance with automation and structured reusability
 
Modern insurance product data services platform
 
Client
A large NY-based life insurance and investment services provider 
Goal
Create a unified, scalable architecture for secure, standardized sharing of data with downstream services 
Tools and Technologies
Java 21, Spring Boot, AWS, Jenkins, Stonebranch, Jira, Microservices, Redshift, Aurora, DynamoDB, Angular, JavaScript, MySQL, EKS, SQS
Business Challenge
The client's product data mart lacked a secure, standardized method for sharing data with downstream services, which undermined data governance and consistency across applications. The client also risked disrupting tightly integrated legacy data flows while shifting from the legacy PACE system, which supports critical AEM microservices, to a PDP mart.
It needed a scalable new system, and the modernization had to be carried out with minimal disruption to existing systems.
 
Solution
Implemented a dual-solution strategy to modernize data access and delivery. Key elements included:
- Built a microservices-based platform and deployed it on AWS
- Enabled secure, flexible API access to the product data mart using tagged identifiers (illustrated in the sketch after this list)
- Ensured a scalable design with minimal code changes for expansion
- Allowed existing AEM microservices to operate without changes during and post-transition
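The production platform is Java 21/Spring Boot, but the tagged-identifier idea is easy to show language-agnostically. A deliberately simplified Python sketch, with hypothetical tags, sources, and roles:

```python
# Hypothetical sketch: consumers request data by stable, governed tags rather
# than physical table names, so the mart can evolve without breaking them.
from dataclasses import dataclass

@dataclass(frozen=True)
class TaggedDataset:
    tag: str               # stable identifier exposed to consumers
    physical_source: str   # internal location, free to change over time
    allowed_roles: frozenset

CATALOG = {
    "product.rates.current": TaggedDataset(
        "product.rates.current", "pdp_mart.rates_v3", frozenset({"pricing", "aem"})),
    "product.features": TaggedDataset(
        "product.features", "pdp_mart.features_v1", frozenset({"aem"})),
}

def resolve(tag: str, caller_role: str) -> str:
    """Map a consumer-facing tag to its physical source, enforcing entitlements."""
    ds = CATALOG.get(tag)
    if ds is None:
        raise KeyError(f"unknown tag: {tag}")
    if caller_role not in ds.allowed_roles:
        raise PermissionError(f"role {caller_role!r} is not entitled to {tag}")
    return ds.physical_source

print(resolve("product.rates.current", "aem"))  # -> pdp_mart.rates_v3
```

Because consumers bind to tags rather than physical objects, the backing store can be re-pointed without client-side changes, a property that aligns with the goal of letting the AEM microservices operate unchanged through the transition.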
 
 
Outcomes
- Improved efficiency and security by enabling standardized, governed access to product and entity data
- Achieved faster delivery and reduced manual effort
- Ensured seamless integration while modernizing the backend
- Maintained front-end stability of AEM microservices
 
Modern insurance platform for loan processing
 
Client
A large NY-based life insurance and investment company 
Goal
Modernize product data management services to enhance loan processing and improve user experience 
Tools and Technologies
Angular 15, .NET 4.8, .NET 8.0, AWS EC2, SQL Server, PostgreSQL, Stonebranch, SharePoint, IIS
Business Challenge
The client operated a critical legacy application for loan processing that suffered from significant operational inefficiencies and a poor user experience. These shortcomings hindered the speed and accuracy of loan disbursements, affecting both internal operations and customer satisfaction.
The goal was to modernize the platform with scalable solutions that could enable secure distribution of critical data and better governance, while integrating smoothly with downstream applications.
 
Solution
- Implemented a holistic modernization approach with a focus on system integration, functional enhancements, and UI transformation
- Streamlined deal creation and approval through a third-party system with robust verification and compliance features
- Enabled multi-loan support under a single deal
- Ensured consistency by introducing a standardized component library with reusable UI components
- Revamped the interface with intuitive design, responsive layouts, and improved user experience
 
Outcomes
- Enhanced efficiency, scalability, and user-centric loan operations
- Improved ROI through greater efficiency, better user experience, and agility
- Reduced verification effort by 30–55% and enabled focus on higher-value tasks
- Streamlined processes and minimized data duplication
- Ensured consistent design and efficient development
- Increased customer satisfaction, with a projected 50% rise in retention
 
Eagle Access data platform transforms accounting
 
 
 
Client
Large NY-based life insurance and investment company 
Goal
Consolidate multiple accounting systems into a centralized, reliable data warehouse to improve reporting and decision-making 
Tools and Technologies
BNY Eagle Access, Python, Oracle, AWS EC2 and S3, SQL Server, Jira, Stonebranch 
Business Challenge
The insurer faced growing inefficiencies due to siloed accounting systems that lacked integration, consistency, and scalability. Reporting processes were time-consuming, error-prone, and lacked real-time visibility, hindering timely business and investment decisions. A centralized solution was needed to ingest and unify data from disparate systems like SAP GL, Singularity, and Loan Management into a single, trusted platform to support strategic financial insights and reduce operational complexity.
 
Solution
- Built a centralized accounting data warehouse using the Eagle Access secure private cloud environment
- Ingested and standardized data from SAP GL, Singularity, and Loan Management systems (see the sketch after this list)
- Used AWS EC2, S3, and Stonebranch Universal Automation Controller for cloud infrastructure and job orchestration
- Enabled real-time reporting via Tableau integration and migration of legacy dashboards
- Improved data accuracy and consistency through robust validation and automation
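A minimal sketch of the standardization step, assuming hypothetical source field names and a simplified canonical record; the client's actual schemas differ:

```python
# Illustrative normalization of ledger entries from heterogeneous sources
# into one canonical shape before warehouse loading. Field names are
# hypothetical placeholders, not the client's schemas.
from dataclasses import dataclass
from datetime import date

@dataclass
class CanonicalEntry:
    source_system: str
    account: str
    amount: float
    posted_on: date

def from_sap_gl(row: dict) -> CanonicalEntry:
    return CanonicalEntry("SAP_GL", row["account_code"], float(row["amount_local"]),
                          date.fromisoformat(row["posting_date"]))

def from_loan_mgmt(row: dict) -> CanonicalEntry:
    return CanonicalEntry("LOAN_MGMT", row["gl_account"], float(row["amount_usd"]),
                          date.fromisoformat(row["post_date"]))

# One normalizer per source keeps downstream ingestion uniform.
NORMALIZERS = {"SAP_GL": from_sap_gl, "LOAN_MGMT": from_loan_mgmt}

raw = {"system": "SAP_GL",
       "payload": {"account_code": "400100", "amount_local": "1250.75",
                   "posting_date": "2024-03-31"}}
print(NORMALIZERS[raw["system"]](raw["payload"]))
```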
 
Outcomes
- Created a unified source of truth for all accounting data
- Enabled faster, more accurate reporting and analytics, improving business and investment decision-making
- Reduced data silos and improved accessibility across systems
- Minimized infrastructure complexity and operational risk with secure private cloud hosting
- Enhanced efficiency through automated data processing and orchestration
 
Gen AI platform offers future-ready capabilities
 
 
Client
A leading North American bank 
Goal
Provide a wide range of AI capabilities for various risk and business teams and avoid building fragmented, outdated systems 
Tools and Technologies
Amazon Bedrock and Titan V2, pgvector, Faiss, OpenSearch, Llama 7B, Claude Sonnet 3.5 and 3.7 
Business Challenge
The Enterprise Risk function at a leading North American bank initiated a Generative AI (Gen AI) solution to offer a wide range of AI capabilities, including document intelligence, summarization, generation, translation, and more.
As the project evolved through proofs of concept and pilots, a key challenge emerged: the risk of creating a fragmented ecosystem with an overwhelming array of unmanageable bespoke solutions, model integrations, and reliance on potentially outdated models and libraries.
 
Solution
Based on prior engagements across clients, our team delivered thought leadership on developing and delivering capabilities through a platform approach. We also set up a Minimum Viable Product team to iterate on new problem areas and solution approaches. Platform development included generalized capabilities for:
- Setting up document ingestion pipelines, with choice of parsing approaches, embedding models and vector index stores
- A factory model, along with configurations for integrating new parsers, embedding models, LLM interfaces, etc., to quickly bring new capabilities to the platform (sketched after this list)
- User management, SSO integration, entitlements management
- API integration to bring in information/ data from internal and external sources
- Platform support for pgvector, Faiss, OpenSearch, Amazon Titan V2, Llama 7B, and Claude Sonnet 3.5 and 3.7
- An intuitive chat interface for end users and for AI Masters, designated business users trained in prompt engineering and other techniques to assemble new AI/Gen AI capabilities through configuration
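A minimal sketch of the factory/registry idea: components register once under a configuration key, and pipelines are assembled from config. The names and the toy embedder below are illustrative, not the bank's code:

```python
# Hypothetical registry-based factory: new parsers, embedding models, or LLM
# interfaces are registered once and selected purely via configuration.
from typing import Callable, Dict, List

PARSERS: Dict[str, Callable[[bytes], str]] = {}
EMBEDDERS: Dict[str, Callable[[str], List[int]]] = {}

def register(registry: Dict, name: str):
    """Decorator that adds a component to a registry under a config key."""
    def wrap(fn):
        registry[name] = fn
        return fn
    return wrap

@register(PARSERS, "plain_text")
def parse_plain_text(data: bytes) -> str:
    return data.decode("utf-8", errors="replace")

@register(EMBEDDERS, "toy_hash")
def embed_toy_hash(text: str) -> List[int]:
    # Stand-in for a real embedding model such as Amazon Titan V2.
    return [ord(c) % 13 for c in text[:8]]

def build_ingestion_pipeline(config: dict):
    """Assemble a parse -> embed pipeline purely from configuration."""
    parser, embedder = PARSERS[config["parser"]], EMBEDDERS[config["embedder"]]
    return lambda doc: embedder(parser(doc))

pipeline = build_ingestion_pipeline({"parser": "plain_text", "embedder": "toy_hash"})
print(pipeline(b"risk memo"))
```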
 
Outcomes
- A future-ready Gen AI platform that can easily incorporate new capabilities and updates
- Multiple specific capabilities, called skills, for use by various risk teams and business users
- A forward-looking roadmap, including the ability to compose more complex capabilities from atomic ones
 
Data migration to cloud expedites credit risk functions
 
Client
A leading North American bank 
Goal
Migrate credit risk data and SAS-based analytics models from on-premises data warehouse to AWS to enhance functionality 
Tools and Technologies
AWS Glue, Redshift, DataSync, Athena, CloudWatch, SageMaker; Apache Airflow; Delta Lake; Power BI 
Business Challenge
The credit risk unit of a major bank aimed to migrate SAS-based analytics models containing data for financial forecasting and sensitivity analysis to Amazon SageMaker.
This was to leverage benefits such as enhanced scalability, improved maintenance for MLOps engineers, and better developer experience. It also sought to migrate credit risk data from a Netezza-based on-premises data warehouse to AWS, utilizing a data lake on AWS S3 and a data warehouse on Redshift to support model migration.
 
Solution
- Decoupled data workload processing from relational systems using a phased approach focused on historical migration, transformation complexity, data volumes, and the ingestion frequency of incremental loads
- Developed a flexible ETL framework using DataSync for extracting data to AWS as flat files from Netezza
- Transformed data in S3 layers using Glue ETL and moved it to the Redshift data warehouse
- Enabled Glue integration with Delta Lake for incremental data workloads (a simplified sketch follows this list)
- Built ETL workflows using AWS Step Functions and orchestrated concurrent workflow runs with Apache Airflow
- Architected the overall data movement from Netezza to AWS around the flexible ETL framework
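A simplified sketch of the incremental-load step, assuming a Glue/Spark runtime with the Delta Lake connector available; bucket paths, keys, and table locations are hypothetical:

```python
# Illustrative incremental upsert with Spark + Delta Lake. Paths and keys are
# placeholders; assumes the runtime is configured with the Delta connector.
from pyspark.sql import SparkSession

spark = (SparkSession.builder
         .appName("incremental-credit-risk-load")
         .getOrCreate())

# Latest incremental extract landed as flat files on S3 (e.g., via DataSync).
incoming = spark.read.option("header", True).csv("s3://risk-lake/raw/exposures/")
incoming.createOrReplaceTempView("incoming")

# Upsert into the curated Delta table, keyed on a hypothetical exposure_id.
spark.sql("""
    MERGE INTO delta.`s3://risk-lake/curated/exposures` AS target
    USING incoming
    ON target.exposure_id = incoming.exposure_id
    WHEN MATCHED THEN UPDATE SET *
    WHEN NOT MATCHED THEN INSERT *
""")
```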
 
Outcomes
- Enhanced financial forecasting and sensitivity analysis operations with analytical models and data migrated to the AWS public cloud
- Expedited time-to-market, catering to the client's downstream consumption needs through Power BI and Amazon SageMaker
 
Custom analytics enable faster business decisions
 
 
 
Client
U.S.-based asset management company 
Goal
Streamline and improve data and analytics capabilities for enhanced user experiences 
Tools and Technologies
Java, React JS, MS SQL Server, Spring Boot, GitHub, Jenkins 
Business Challenge
The client captures voluminous data from multiple internal and external sources, but business users lacked quick, on-demand capabilities to generate customized portfolio analytics on attributes such as average quality, yield to maturity, and average coupon.
The client's teams were spending enormous manual effort and elapsed time (approximately 12-15 hours) responding to requests for proposals from their respective clients.
 
Solution
Iris implemented a data acquisition and analytics system with pre-processing capabilities for grouping, classifying, and handling historical data.
A data dictionary was established for key concepts, such as asset classes and industry classifications, enabling end users to access data for analytical computation. The analytics engine was refactored, optimized, and integrated into the streamlined investment performance data infrastructure.
The team developed an interactive self-service capability, allowing business users to track data availability, perform advanced searches, generate custom analytics, visualize information, and utilize the insights for decision-making.
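As a toy illustration of the kind of analytic end users can now generate on demand (the holdings below are made-up numbers, and the client's engine is far richer), a market-value-weighted average coupon:

```python
# Toy example of one portfolio analytic: market-value-weighted average coupon.
# Holdings data is fabricated for illustration.
def weighted_average_coupon(holdings):
    """holdings: iterable of (market_value, coupon_pct) pairs."""
    total_mv = sum(mv for mv, _ in holdings)
    return sum(mv * coupon for mv, coupon in holdings) / total_mv

portfolio = [(1_000_000, 4.25), (500_000, 3.10), (250_000, 5.00)]
print(f"{weighted_average_coupon(portfolio):.2f}%")  # -> 4.03%
```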
 
Outcomes
The solution brought several benefits to the client, including:
- Simplified data access to generate custom analytics for end users
- Eliminated manual processing and the need for complex queries
- Enhanced the stakeholder experience
- Reduced response time to client RFPs by over 50%
 
Big Data platform improves global AML compliance
 
 
 
Client
A leading global bank with operations in over 100 countries 
Goal
Address data quality and cost challenges of legacy AML application infrastructure 
Tools and Technologies
Hadoop, Hive, Talend, Kafka, Spark, ETL 
Business Challenge
The client’s legacy AML application infrastructure created challenges in data acquisition, quality assurance, data processing, AML rules management, and reporting.
High data volumes and rules-based algorithms were generating large numbers of false positives, and multiple instances of legacy vendor platforms added cost and complexity.
 
Solution
Iris developed and implemented multiple AML Trade Surveillance applications and Big Data capabilities. The team designed a centralized data hub with Cloudera Hadoop for AML business processes and migrated application data to the big data analytical platform in the client’s private cloud. Switching from a rules-based approach to algorithmic analytical models, we incorporated a data lake with logical layers and developed a metadata-driven data quality monitoring solution (sketched below).
We enabled the support for AML model development, execution and testing/validation, and integration with case management. Our data experts also deployed a custom metadata management tool and UI to manage data quality. Data visualization and dashboards were implemented for alerts, monitoring performance, and tracking money laundering activities.
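A minimal sketch of the metadata-driven idea: quality checks live as metadata (here, a plain list of rule dictionaries; in the actual solution, a managed catalog with a custom UI) and a generic engine applies them to incoming data. Rule names and fields are hypothetical:

```python
# Hypothetical metadata-driven data quality checks: rules are data, and one
# generic engine evaluates them against incoming transaction records.
RULES = [
    {"column": "txn_amount", "check": "not_null"},
    {"column": "txn_amount", "check": "min", "value": 0},
    {"column": "country_code", "check": "in_set", "value": {"US", "GB", "HK"}},
]

def run_checks(rows, rules):
    """Yield (rule, row) pairs for every metadata-defined violation."""
    for row in rows:
        for rule in rules:
            val = row.get(rule["column"])
            ok = ((rule["check"] == "not_null" and val is not None) or
                  (rule["check"] == "min" and val is not None and val >= rule["value"]) or
                  (rule["check"] == "in_set" and val in rule["value"]))
            if not ok:
                yield rule, row

sample = [{"txn_amount": -50, "country_code": "US"},
          {"txn_amount": 200, "country_code": "ZZ"}]
for rule, row in run_checks(sample, RULES):
    print(f"violation of {rule['check']} on {rule['column']}: {row}")
```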
 
Outcomes
The implemented solution delivered tangible outcomes, including:
- Centralized data hub capable of handling 100+ PB of data and ~5,000 users across 18 regional hubs for several countries
- Ingestion of 30+ million transactions per day from different sources
- Greater insights with scanning of 1.5+ billion transactions every month
- False positives reduced by over 30%
- AML data storage cost reduced to <10 cents per GB per year
- Extended support to multiple countries and business lines across six global regions; legacy instances reduced from 30+ to <10
 
Investment warehouse enhances communications
 
 
 
Client
A U.S.-based investment bank 
Goal
Improve data collation and information quality for enhanced marketing and client reporting functions 
Tools and Technologies
Composite C1, Oracle DB, PostgreSQL, Vermilion Reporting Suite, Python, MS SQL Server, React.js 
Business Challenge
The client’s existing investment data structure lacked a single source of truth for investment and performance data. The account management and marketing teams were making significant manual efforts to track portfolio performance, identify opportunities and ensure accurate client reporting. The time-consuming and manual processes of generating marketing exhibits and client reports were highly error-prone.
 
Solution
Iris implemented a comprehensive investment data infrastructure for a single source of truth and improved reporting capabilities for marketing content and client report generation. 
An automated Quality Assurance process was instituted to validate the information in critical marketing materials, such as fact sheets, snapshots, sales kits, and flyers, against the respective data source systems. 
Retail and institutional portals were developed to provide a consolidated view of portfolios, with the ability to drill down to underlying assets, AUM (Assets Under Management) trends, incentives, commissions, and active opportunities.
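A minimal sketch of the automated QA step, assuming hypothetical exhibit fields and a simple relative tolerance; the production process validates many more attributes against each source system:

```python
# Illustrative reconciliation of values shown in a marketing exhibit against
# the source-of-truth warehouse. Field names and tolerance are assumptions.
def reconcile(exhibit: dict, source: dict, tolerance: float = 0.005):
    """Return fields where the exhibit drifts from the source beyond tolerance."""
    issues = []
    for field, shown in exhibit.items():
        truth = source.get(field)
        if truth is None or abs(shown - truth) > tolerance * abs(truth):
            issues.append((field, shown, truth))
    return issues

fact_sheet = {"aum_musd": 1520.0, "one_year_return_pct": 7.41}
warehouse = {"aum_musd": 1518.6, "one_year_return_pct": 7.43}
print(reconcile(fact_sheet, warehouse))  # [] means the exhibit is clean
```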
 
Outcomes
The new data infrastructure delivered a holistic, on-demand view of investment details, including performance characteristics, breakdowns, attributions, and holdings, to the client's marketing team and account managers with:
- ~95% reduction in performance data and exhibit information discrepancies
- ~60% improvement in operational efficiency in core marketing and client reporting functions
 