Large Language Models (LLMs)
Generative AI for Quantum and
General Purpose Applications
Bespoke LLM solutions integrated with real-world operational workflows.
The Floquet team will create AI/ML pipelines for even the most complex domain problems, taking full advantage of the best open-source and AWS cloud-platform toolsets. Whether it’s ETL, metadata handling, pre- and post-processing, result-set collation, or distribution, Floquet has the expertise to optimise the utility of the underlying AI/ML tuning and control mechanisms.
Sovereign Australian Capability
- Assurances regarding non-sovereign data ingress/egress
- Canberra-based (ANU Momentum members)
- Federal and State Government clients
AWS Partners
Amazon Bedrock
Amazon SageMaker
Amazon Q
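As a sketch of how a managed-model integration might look, the snippet below builds a request body for Amazon Bedrock's InvokeModel API via boto3. The model ID, region, and prompt are illustrative assumptions, not a client configuration.

```python
import json

# Illustrative model ID; the model family in use determines the body schema.
MODEL_ID = "anthropic.claude-3-haiku-20240307-v1:0"

def build_bedrock_body(prompt: str, max_tokens: int = 256) -> str:
    """Build a JSON request body in the Anthropic Messages format
    accepted by Amazon Bedrock's InvokeModel API."""
    return json.dumps({
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": max_tokens,
        "messages": [{"role": "user", "content": prompt}],
    })

# With AWS credentials configured, the call itself would look like:
#   import boto3
#   client = boto3.client("bedrock-runtime", region_name="ap-southeast-2")
#   response = client.invoke_model(modelId=MODEL_ID,
#                                  body=build_bedrock_body("Summarise this document."))
#   print(json.loads(response["body"].read())["content"][0]["text"])
```

Keeping the body-construction step separate from the network call makes it easy to unit-test the integration without live AWS access.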
Bespoke Applications
- Integrated ‘Q&A’ Smart ChatBots
- Tailored Document Generation
- Automated Data Cleansing at scale
- Personalised Semantic Search
- Novel AI use cases, e.g. Impartial Arbitration
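Personalised semantic search boils down to ranking documents by embedding similarity to a query. The sketch below uses toy 3-dimensional vectors in place of real model embeddings; the corpus and dimensions are illustrative assumptions.

```python
import math
from typing import Dict, List, Tuple

def cosine_similarity(a: List[float], b: List[float]) -> float:
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def semantic_search(query_vec: List[float],
                    corpus: Dict[str, List[float]],
                    top_k: int = 3) -> List[Tuple[str, float]]:
    """Rank documents by embedding similarity to the query vector."""
    scored = [(doc_id, cosine_similarity(query_vec, vec))
              for doc_id, vec in corpus.items()]
    return sorted(scored, key=lambda pair: pair[1], reverse=True)[:top_k]

# Toy "embeddings" stand in for real model output.
corpus = {
    "leave-policy": [0.9, 0.1, 0.0],
    "travel-claims": [0.1, 0.8, 0.1],
    "it-security": [0.0, 0.2, 0.9],
}
results = semantic_search([0.85, 0.15, 0.05], corpus, top_k=1)
```

In production the toy vectors would come from an embedding model and the linear scan would be replaced by an indexed vector store, but the ranking logic is the same.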
Technical Aspects of LLM Operational Chains
LLM Access and Integration
- Model flexibility to optimise the cost/benefit trade-off
- Well-defined guardrails for determinism and quality assurance
LLM and ANN Optimisation
- Vector databases for content retrieval
- Model tuning
- Cost management and monitoring
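Cost management for LLM workloads typically means tracking token counts per call against per-token pricing and a budget. The sketch below uses assumed, illustrative prices; real rates vary by model and provider.

```python
# Assumed illustrative prices (USD per 1,000 tokens) -- not real rates.
PRICES = {
    "small-model": {"input": 0.0005, "output": 0.0015},
    "large-model": {"input": 0.0100, "output": 0.0300},
}

def estimate_cost(model: str, input_tokens: int, output_tokens: int) -> float:
    """Estimate one call's cost from token counts and per-1k-token rates."""
    rate = PRICES[model]
    return (input_tokens / 1000) * rate["input"] + (output_tokens / 1000) * rate["output"]

class CostMonitor:
    """Accumulates spend across calls and flags when a budget is exceeded."""

    def __init__(self, budget_usd: float):
        self.budget = budget_usd
        self.spent = 0.0

    def record(self, model: str, input_tokens: int, output_tokens: int) -> bool:
        """Record a call; return True while spend remains within budget."""
        self.spent += estimate_cost(model, input_tokens, output_tokens)
        return self.spent <= self.budget
```

A monitor like this can sit in the pipeline's post-processing step and route traffic to a cheaper model, or halt, when the budget threshold is crossed.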
DevSecOps Capabilities
- Fully secure end-to-end GitLab CI/CD
- Automated regression testing
- Best-of-Breed Toolsets
Enhanced Tailored Workflows
- Agents for increased flexibility
- RAG and prompt wrapping
- Chain-based responsive workflows
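At its core, the RAG-with-prompt-wrapping step means splicing retrieved passages into a template around the user's question before the model sees it. The template wording and sample passages below are illustrative assumptions.

```python
from typing import List

# Illustrative template; production prompts are tuned per model and task.
PROMPT_TEMPLATE = """Answer the question using only the context below.
If the context is insufficient, say so.

Context:
{context}

Question: {question}
Answer:"""

def wrap_prompt(question: str, retrieved_chunks: List[str]) -> str:
    """Wrap retrieved passages around the user's question (the RAG step)."""
    context = "\n---\n".join(retrieved_chunks)
    return PROMPT_TEMPLATE.format(context=context, question=question)

prompt = wrap_prompt(
    "What is the leave entitlement?",
    ["Staff accrue 20 days of annual leave.",
     "Leave requests need manager approval."],
)
```

In a chain-based workflow this wrapping step sits between the vector-store retrieval and the model call, so each stage can be tested and swapped independently.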
Serverless AI/ML Solutions
- Big Compute algorithms
- Integration with Floquet ANNabler AI/ML platform
Support & Maintenance
- Solution Operationalisation
- On-call ‘code cutting’ Helpdesk