Securing AI Adoption in Government: From Mandates to Implementation

One of today’s top trends is artificial intelligence (AI), specifically how the Public Sector can adopt it while maintaining the security, governance and oversight essential for mission-critical operations. With AI jumping from number three to number one on Federal Chief Information Officers’ (CIO) priority lists and 80% of CIOs under explicit cost savings mandates, the question is no longer whether to deploy AI but how to do so securely at scale.

The recent overhaul of the Federal Acquisition Regulation (FAR) marks the most significant rewrite in over 40 years, fundamentally shifting how Federal agencies operate and procure technology. As generative AI (GenAI) deployments move into mission-critical environments, agencies need practical frameworks that balance speed with verification.

Moving From Speed to Velocity

As the Public Sector enters the age of AI, buoyed by $4 trillion in Private Sector investment in data centers, agencies face a fundamental design challenge: designing AI systems that adapt to human workflows rather than forcing humans to adapt to systems. This distinction matters most in Government and defense contexts, where lives depend on maintaining human oversight of deliberate decisions.

The Department of War’s (DoW) Acquisition Transformation Strategy (ATS) offers a proven model of buying outcomes in increments. Instead of funding calendar time through traditional program structures, agencies should fund missions through portfolios that deliver outcomes in weeks or months. This approach structures procurement in modular increments that integrate with evolving architecture while funding capability and delivery, not timelines.

Velocity differs from speed in its directional precision. Agencies can accelerate procurement through fast-lane processes while maintaining governance through evidence gates that verify operational performance, user adoption, cyber risk posture and sustainment realities. This framework preserves ethical obligations while delivering measurable results.
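The evidence-gate idea can be sketched in code. This is an illustrative sketch only; the four checks, field names and thresholds below are hypothetical, not drawn from any agency policy:

```python
# Illustrative sketch of an "evidence gate": an increment advances only when
# every verification area clears its bar. All names and thresholds invented.
from dataclasses import dataclass

@dataclass
class IncrementEvidence:
    operational_performance_met: bool   # measured against mission KPIs
    user_adoption_rate: float           # fraction of target users actively using it
    open_critical_findings: int         # unresolved critical cyber findings
    sustainment_plan_approved: bool     # funded operations and maintenance plan

def passes_evidence_gate(e: IncrementEvidence, min_adoption: float = 0.6) -> bool:
    """Approve only when all four evidence areas pass."""
    return bool(
        e.operational_performance_met
        and e.user_adoption_rate >= min_adoption
        and e.open_critical_findings == 0
        and e.sustainment_plan_approved
    )

increment = IncrementEvidence(True, 0.72, 0, True)
print(passes_evidence_gate(increment))  # True: all four gates clear
```

The point of the sketch is directional precision: speed is preserved by keeping the gate small, while governance is preserved by making every area explicit and checkable.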

Prerequisites for Secure AI Implementation

Before deploying AI tools in production environments, agencies need foundational elements in place:


Policy frameworks that define where AI can be a part of the process and establish clear boundaries for all personnel. Training and enablement programs ensure teams understand governance requirements and security policies. Several Federal agencies have already created AI centers of excellence to help establish standards and create processes around how they are implementing AI.

End-to-end visibility across the entire software delivery process enables agencies to track where AI agents operate and what actions they perform. Without comprehensive visibility, governance becomes theoretical rather than operational.

Contextual accuracy determines output quality: AI systems deliver accurate, usable results only when provided with the right context, making data quality and integration critical prerequisites.

Built-in guardrails must exist before AI implementation. Security scans on every code change and controls preventing critical vulnerabilities from merging into production branches become essential as agencies move into the agentic AI era.
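As one concrete illustration, guardrails of this kind can be wired into a GitLab pipeline by including GitLab’s documented security scanning templates. The fragment below is a minimal sketch, and blocking merges on critical findings is typically configured through a merge request approval (scan result) policy rather than in the pipeline file itself:

```yaml
# Illustrative .gitlab-ci.yml fragment: run security scans on every code change.
# Template paths follow GitLab's documentation; adjust to your instance/version.
include:
  - template: Security/SAST.gitlab-ci.yml               # static application security testing
  - template: Security/Secret-Detection.gitlab-ci.yml   # leaked-credential checks
  - template: Security/Dependency-Scanning.gitlab-ci.yml # vulnerable dependency checks

# Preventing critical vulnerabilities from merging into protected branches is
# normally enforced with a merge request approval (scan result) policy, not here.
```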

Practical AI Use Cases That Deliver Value

GitLab’s most recent DevSecOps survey reports that AI currently handles about 25% of the work in Public Sector organizations, with leadership targeting 50% automation. The most successful implementations focus on code generation, testing and documentation, areas where AI delivers immediate, measurable impact.

Federal customers using GitLab’s AI capabilities report significant efficiency gains in code review processes. AI-powered first-pass reviews reduce time while maintaining quality standards. Test generation and legacy code modernization have proven particularly effective.

Compliance automation represents an emerging high-value use case. GitLab teams are developing compliance agents that access code repositories, Continuous Integration/Continuous Deployment (CI/CD) pipelines and security vulnerability data to automatically populate Security Technical Implementation Guide (STIG) checklists. Security team leaders review and adjust outputs as necessary, reducing administrative burden while allowing teams to focus on strengthening application security posture.
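GitLab has not published the internals of these compliance agents; purely to illustrate the general pattern, the toy sketch below maps hypothetical scanner findings onto hypothetical STIG checklist items to produce a draft for human review. The STIG IDs and finding categories are made up:

```python
# Toy sketch of compliance-checklist drafting: map pipeline security findings
# onto checklist items so a reviewer starts from a draft, not a blank form.
# STIG IDs and finding categories below are invented for illustration.

findings = [  # in practice, pulled from CI/CD pipelines and vulnerability reports
    {"id": "V-222400", "category": "session-timeout", "severity": "high"},
    {"id": "V-222500", "category": "input-validation", "severity": "critical"},
]

checklist = {
    "session-timeout": "Not Reviewed",
    "input-validation": "Not Reviewed",
    "audit-logging": "Not Reviewed",
}

def draft_checklist(checklist: dict, findings: list) -> dict:
    """Mark items with open findings; humans review and adjust afterward."""
    flagged = {f["category"] for f in findings}
    return {item: ("Open" if item in flagged else "Not a Finding")
            for item in checklist}

print(draft_checklist(checklist, findings))
# {'session-timeout': 'Open', 'input-validation': 'Open', 'audit-logging': 'Not a Finding'}
```

The human-in-the-loop step in the source text maps to reviewing this draft, which is what keeps the automation an aid rather than an authority.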

Prioritizing AI Governance Frameworks

With 35% of Public Sector professionals using unofficial AI tools at work, agencies need governance frameworks that address shadow IT risks without stifling innovation. A risk-based approach identifies high-impact systems within critical infrastructure and implements controls that prevent systemic failures.

Effective governance prioritizes AI adoption around innovation while maintaining public trust. Agencies must identify high-impact areas and understand system interdependencies; as more systems connect, understanding how one system impacts another becomes essential for appropriate segmentation and risk management.
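The interdependency analysis described here can be illustrated with a toy reachability check over a dependency graph: if one system fails, which downstream systems are affected? The system names are invented:

```python
# Minimal sketch: given "A feeds B" dependencies, find every downstream system
# affected when one system fails. System names are hypothetical.
from collections import deque

feeds = {  # directed edges: system -> systems that consume its output
    "identity-service": ["benefits-portal", "case-mgmt"],
    "case-mgmt": ["reporting"],
    "benefits-portal": [],
    "reporting": [],
}

def downstream_impact(start: str) -> set:
    """Breadth-first walk of everything reachable from a failing system."""
    seen, queue = set(), deque([start])
    while queue:
        node = queue.popleft()
        for consumer in feeds.get(node, []):
            if consumer not in seen:
                seen.add(consumer)
                queue.append(consumer)
    return seen

print(sorted(downstream_impact("identity-service")))
# ['benefits-portal', 'case-mgmt', 'reporting']
```

Systems with large downstream blast radii are natural candidates for tighter segmentation and stricter controls.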

Building on Secure Foundations

Agencies cannot build on a shaky foundation. Federal AI and cybersecurity strategies must align around building responsibility into the process from the start. This requires shifting from governing static systems to engineering systems that can evolve safely, integrating assurances, accountability and human judgment as foundational design constraints instead of downstream checks.

Before deploying advanced AI capabilities, agencies should strengthen foundational practices: standardizing workflows, implementing security by design and ensuring basic guardrails are in place. AI cannot compensate for weak foundations in the software development lifecycle. The path forward requires doubling down on fundamentals while strategically adopting AI where it delivers clear value.

To learn more about implementing secure AI solutions, watch GitLab’s full webinar, “Cyber in the AI Era: Building Foundations for Secure Adoption.”

Carahsoft Technology Corp. is The Trusted Government IT Solutions Provider, supporting Public Sector organizations across Federal, State and Local Government agencies and Education and Healthcare markets. As the Master Government Aggregator® for our vendor partners, including GitLab, we deliver solutions for Geospatial, Cybersecurity, MultiCloud, DevSecOps, Artificial Intelligence, Customer Experience and Engagement, Open Source and more. Working with resellers, systems integrators and consultants, our sales and marketing teams provide industry leading IT products, services and training through hundreds of contract vehicles. Explore the Carahsoft Blog to learn more about the latest trends in Government technology markets and solutions, as well as Carahsoft’s ecosystem of partner thought-leaders.

Building a DevSecOps Culture

As software becomes more sophisticated, it plays an increasingly important role in all aspects of government operations. However, given the complexity and intertwined nature of modern software, any vulnerability could have wide-ranging consequences, which makes security of vital importance. The federal government has taken notice. A number of recent policy directives address issues related to the software supply chain, and key agencies are leading a governmentwide effort to promote secure software development, including the Executive Order on Transforming Federal Customer Experience and Service Delivery to Rebuild Trust and the Executive Order on Improving the Nation’s Cybersecurity. Learn how you can implement DevSecOps to support your journey to secure, innovative software in Carahsoft’s Innovation in Government® report.

 

The Mindset Shift that Enables DevSecOps

“In an ideal world, technology and processes support team members’ ability to deliver on their particular talents. Before agencies implement DevSecOps methodologies, they should identify where their processes are getting bottlenecked and forcing people to either work around them or fundamentally change their behavior. Instead, we want to make it easy for employees to do the right thing. The goal is to enable people to focus on what they do best — regardless of where they operate in the stack or the tools they are using — so that agencies can build and deploy secure, modern apps.”

Read more insights from Alex Barbato, Public Sector Solutions Engineer at VMware.

 

How Generative AI Improves Software Security

“Generative AI tools are becoming increasingly prevalent, providing interactive experiences that captivate the public’s imagination. These tools are accessible to anyone, offering a unique opportunity to engage and explore the creative possibilities enabled by AI technology. The technology doesn’t just train a model to recognize patterns. It can create things that are easy to understand: images, text, even videos. Sometimes the results are hilariously wrong, but other times the results are quite impressive, such as clear, concise answers to complex questions. Generative pre-trained transformer (GPT) technology, such as ChatGPT, has opened the doors for everyone to be an evaluator because the output is accessible and easy to critique.”

Read more insights from Robert Larkin, Senior Solutions Architect at Veracode.

Open Source is at the Heart of Software Innovation

“Embedding security into applications from the start is essential for streamlining and strengthening the entire development life cycle. Securing the software supply chain is a related effort that is of vast importance to government operations. Beyond securing individual applications, the ultimate goal is to build security into the pipeline itself. At each step and every handoff, we must be able to verify who has touched the software and who did what to ensure that the end result is what we intended to build and that nothing malicious has been injected along the way.”

Read more insights from Chris Mays, Staff Specialist Solutions Architect at Red Hat.

 

DevSecOps Needs Tool Diversity and Collaboration

“As DevSecOps methodologies and software factories grow in prevalence, agencies are recognizing that software development is a team sport — inside the agency, across departments and with external stakeholders. It touches many different teams, but getting everyone on the same page with tooling can be difficult. Different teams prefer different tools, and that makes collaboration hard. Modern software development brings security practices forward in the timeline while reducing duplication of efforts and improving real-time accountability. Success hinges on removing blockers, creating visibility and making sure collaboration is happening at every stage. In addition, encouraging input from different areas of the organization from the beginning and throughout development is vital for innovation.”

Read more insights from Ben Straub, Head of Public Sector at Atlassian.

 

Observability Speeds Zero Trust and Application Security

“In response to increasing cyberthreats, the government is speeding up the move to zero trust. This security model assumes that every user, request, application and non-human entity is not to be trusted until its identity can be verified. Zero trust principles require a layered defense that is more effective when rooted in observability. To develop an architecture that validates and revalidates every entity on the network, it is necessary to know what those entities are, how they’re communicating and how they typically behave so we can recognize deviations. Zero trust and observability technologies work together to create a more secure and resilient network environment by assuming that all requests for access are untrusted and continuously monitoring the network to detect and respond to potential threats.”

Read more insights from Willie Hicks, Public Sector Chief Technologist at Dynatrace.
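The point about knowing how entities typically behave so deviations can be recognized lends itself to a simple sketch: learn a baseline, then flag observations that stray too far from it. The numbers and threshold below are invented for illustration:

```python
# Minimal sketch of baseline-vs-observed deviation detection: flag an entity
# when its behavior strays far from its learned baseline. Values are invented.
from statistics import mean, stdev

def is_deviation(history: list, observed: float, z: float = 3.0) -> bool:
    """Flag observations more than z standard deviations from the baseline."""
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return observed != mu
    return abs(observed - mu) > z * sigma

baseline = [100, 104, 98, 101, 99, 103, 97, 102]  # requests/minute, typical week
print(is_deviation(baseline, 101))   # False: within normal behavior
print(is_deviation(baseline, 450))   # True: likely worth investigating
```

Real observability platforms use far richer models, but the principle is the same: continuous re-validation requires a notion of "normal" to validate against.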

 

The Role of a Service Mesh in Zero Trust Success

“For large companies and government agencies, it’s safe to assume that a committed attacker is already inside their networks. Executive Order 14028 mandates that every federal agency develop a Zero Trust architecture because it is the most effective approach to mitigating what attackers can do once they’ve made their way inside. What does Zero Trust look like at runtime? One of the key considerations is identity-based segmentation, which involves conducting five policy checks for every request in the system: encrypted connection between service endpoints, service authentication, service-to-service authorization, end user authentication, and end user-to-resource authorization.”

Read more insights from Zack Butcher, Founding Engineer at Tetrate and co-author of the NIST SP 800-204 series and SP 800-207A.
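The five runtime checks listed above can be sketched as a per-request gate. The field names below are invented; in practice a service mesh enforces these checks (for example, through mTLS and authorization policies), not application code:

```python
# Sketch of the five identity-based segmentation checks, applied to every
# request. Field names are invented for illustration; real enforcement
# happens in the mesh, not in application code like this.

def authorize(request: dict) -> bool:
    checks = [
        request.get("connection_encrypted"),   # 1. encrypted connection between endpoints
        request.get("service_authenticated"),  # 2. calling service's identity verified
        request.get("service_authorized"),     # 3. that service may call this service
        request.get("user_authenticated"),     # 4. end user's identity verified
        request.get("user_authorized"),        # 5. that user may access this resource
    ]
    return all(checks)  # deny unless every check passes

req = dict(connection_encrypted=True, service_authenticated=True,
           service_authorized=True, user_authenticated=True, user_authorized=False)
print(authorize(req))  # False: the end user lacks authorization for this resource
```

The design point is that all five checks run on every request; passing four of five still yields a deny.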

 

AI and the Journey to Secure Software Development

“By automating and optimizing DevSecOps workflows, we can still shift security left while relieving developers from the burden of some complex remediation. It begins with a workflow that leverages fully automated security scanning to rapidly identify vulnerabilities as well as providing suggested remediation for vulnerabilities and on-demand remediation training to educate developers on what they are getting into. The rapid evolution of artificial intelligence is making new advances possible. The opportunities go well beyond AI-assisted code creation. AI features are being expanded across the entire software development life cycle. When it comes to security, having AI assist by making code functionality clear or explaining a vulnerability in detail reduces the time required to remediate risk.”

Read more insights from Joel Krooswyk, Federal CTO at GitLab.

 

Scaling App Development While Meeting Security Standards

“The dream for any software development team is constant, stable releases. The faster teams get the work they’ve created into production, the faster the agency can derive value from that work. When app development is stymied by cumbersome security reviews and stability testing and by the need to wait for a deployment window, innovation is stifled and the return on investment is delayed. If agencies want to have efficient, value-driving software development teams, those teams must be able to move with agility. A trustworthy, scalable DevOps pipeline that brings together testing and security in a seamless way allows teams to push out new apps and improvements quickly so government employees and citizens can have a seamless digital experience and the most up-to-date tools and information.”

Read more insights from Kyle Tobener, Head of Security and IT at Copado.

 


Download the full Innovation in Government® report for more insights from DevSecOps thought leaders and additional industry research from FCW.

The Open Source Revolution in Government

Open source technology accounts for a significant portion of most modern applications, with some estimates going as high as 90%, and it is the foundation of many mainstream technologies. Its strength lies in the fact that a vibrant ecosystem of developers contribute to and continually improve the underlying code, which keeps the software dynamic and responsive to changing needs. Enterprise open source software further augments these community-driven projects by providing enterprise-grade support and scalability, while retaining the innovation and flexibility driven by the open source development model. By providing the best of both worlds, such solutions represent a powerful arsenal of tools for addressing government’s most pressing challenges.

In a recent pulse survey of FCW readers, 93% of respondents said they were using open source technology. And more than half of respondents to FCW’s survey see open source as an integral resource for strengthening cybersecurity. That number reflects a positive trend toward a better understanding of open source software’s intrinsic approach to security. The power of enterprise open source technologies lies in a combination of collaboration, transparency and industry expertise. As agencies expand their use of such technologies, they maximize their ability to achieve mission success in the most secure, agile and innovative way possible. Learn how the combined power of community-driven innovation and industry-leading technical support is expanding the government’s capacity for transformation in Carahsoft’s Innovation in Government® report.

 

Why Open Source is a Mission-Critical Foundation

“Open source transforms the way agencies manage hybrid and multi-cloud environments. The most critical technology in the cloud, across all providers, is Linux. Everything is built on top of that foundation — both the infrastructure of the cloud and cloud offerings. Given the right partner, the promise of Linux is that it provides a consistent technology layer for agencies across all footprints, including multiple cloud providers, on-premises data centers and edge environments. From that foundation, agencies and their partners can build portable architectures that leverage other open source technologies. Portability gives organizations the ability to use the same architectures, underlying technologies, monitoring and security solutions, and human skills to manage mission-critical capabilities across all footprints.”

Read more insights from Christopher Smith, Vice President and General Manager of the North America Public Sector at Red Hat.

 

How Open Source is Expanding its Mission Reach

“The real power of open source technologies was revealed when they cracked the code on being highly powered, mission-specific, distributed systems. That’s how we are able to get insights out of data by being able to hold it and query it. Today, open source innovation is being accelerated by the cloud, and the conversation is still changing, with people now demanding that their open source companies be cloud-first platforms. Along the way, the open source technologies that start in the community and then receive a boost of commercial innovation have matured. The most powerful ones are expanding their ability to address more of the government’s mission needs. They are staying interoperable and keeping the data interchange non-proprietary, which is important for government agencies.”

Read more insights from David Erickson, Senior Director of Solutions Architecture at Elastic.

 

The Open Source Community’s Commitment to Security

“A central tenet of software development is visibility and traceability from start to finish so that a developer can follow the code through development, testing, building and security compliance, and then into the final production environment. Along the way, there are some key activities that boost collaboration and positive outcomes, starting with early code previews, where developers can spin up an application for stakeholders to review. Other activities include documented code reviews by peers to ensure the code is well written and efficient. In addition, DevOps components such as open source, infrastructure as code, Kubernetes as a deployment mechanism, automated testing, and better platforms and capabilities have helped developers move away from building ecosystems and instead focus on innovation.”

Read more insights from Joel Krooswyk, Federal CTO at GitLab.

 

The Limitless Potential of an Open Source Database

“One of the most important elements of any database migration is ensuring that proper planning and due diligence have been performed to ensure a smooth and successful deployment. In addition, there are some key considerations agencies should keep in mind when moving to open source databases. It is essential to start with a clear understanding of the business case and objectives for adopting an open source approach. Agencies also need to decide how the database should function and what it should do to support their digital transformation. Then they must choose the optimal method to deploy the database.”

Read more insights from Jeremy A. Wilson, CTO of the North America Public Sector at EDB.

 

Modernizing Digital Services with Open Source

“A composable, open source digital experience platform (DXP) enables agencies to overcome those challenges. Open source technology is continuously contributed to by a community of developers to reflect a wide array of needs across organizations in varying industries and of varying sizes. A composable approach allows agencies to assemble a number of solutions for a fast, efficient system that is tailored to their needs. When agencies combine a composable DXP with open source technology, they have access to best-of-breed software and the ability to customize the assembly to suit their requirements. An enterprise DXP will enable agencies to achieve a 360-degree view of how constituents are engaging with their digital services and gain valuable data to understand how to enhance their experience. Finally, a composable, open source DXP provides a proactive approach to protecting against security and compliance vulnerabilities.”

Read more insights from Tami Pearlstein, Senior Product Marketing Manager at Acquia.

 

Creating Secure Open Source Repositories

“Protecting the software supply chain requires looking at every single thing that might come into an agency’s environment. To understand that level of visibility, I like to use the analogy of a refrigerator. All the ingredients necessary to make a cake or pie are in the refrigerator. We know they are of good quality, and other teams can use them instead of having to find their own. At Sonatype, our software equivalent of a refrigerator is the Nexus Repository Manager. A second aspect of our offering, called Lifecycle, allows us to evaluate the open source components in repositories at every stage of the software development life cycle. One piece of software can download a thousand other components. How do we know if one of those components is malicious?”

Read more insights from Maury Cupitt, Regional Vice President of Sales Engineering at Sonatype.
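The closing question, how to know whether one of a thousand transitive components is malicious, boils down to walking the dependency tree against a list of known-bad components. A toy sketch with made-up package names and versions:

```python
# Illustrative sketch of repository component auditing: walk a component's
# declared dependencies and flag any that appear on a known-vulnerable list.
# Package names, versions and the list itself are invented for illustration.

known_bad = {("examplelib", "1.2.0"), ("parser-utils", "0.9.1")}

dependencies = {  # component -> the components it pulls in
    ("examplelib", "1.2.0"): [("parser-utils", "0.9.1"), ("fastjson-x", "2.0.0")],
    ("parser-utils", "0.9.1"): [],
    ("fastjson-x", "2.0.0"): [],
}

def flag_vulnerable(root: tuple) -> set:
    """Depth-first walk of the dependency tree, collecting flagged components."""
    flagged, stack, seen = set(), [root], set()
    while stack:
        comp = stack.pop()
        if comp in seen:
            continue
        seen.add(comp)
        if comp in known_bad:
            flagged.add(comp)
        stack.extend(dependencies.get(comp, []))
    return flagged

print(sorted(flag_vulnerable(("examplelib", "1.2.0"))))
# [('examplelib', '1.2.0'), ('parser-utils', '0.9.1')]
```

Production tools add version-range matching, curated vulnerability feeds and policy enforcement on top of this basic walk.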

 

Better Data Flows for a Better Customer Experience

“A more responsive and personalized customer experience isn’t much different from the initial problem set that gave birth to Apache Kafka. When people interact with agencies, they want those agencies to know who they are and how they’ve interacted in the past. They don’t want to be asked for their Social Security number three times on the same phone call. They also expect that the information or service they receive will be the same whether they are accessing it over the phone, via a mobile app or on a website. To elevate the quality of their service, agencies must be able to stream information in a low-friction way so different systems are consistent with one another and up-to-date at all times, regardless of the communication channel an individual uses. President Joe Biden’s executive order about transforming the federal customer experience is based on this capability. The most successful companies across industries have figured out how to do it, and for the most part, they’ve done it with open source software.”

Read more insights from Jason Schick, General Manager of Confluent US Public Sector.
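The streaming pattern described here, one shared event log that every channel system consumes, can be illustrated with an in-memory stand-in. A real deployment would use a platform such as Apache Kafka; the structures below are invented for the sketch:

```python
# In-memory stand-in for the streaming pattern: every channel system replays
# the same append-only event log, so phone, mobile and web views of a customer
# stay consistent. Illustration only; real systems would use Apache Kafka.

log = []  # append-only stream of customer-update events

def publish(event: dict) -> None:
    log.append(event)

def materialize_view() -> dict:
    """Replay the stream to build the current record for each customer."""
    view = {}
    for event in log:
        view.setdefault(event["customer"], {}).update(event["fields"])
    return view

publish({"customer": "C-1", "fields": {"address": "old"}})
publish({"customer": "C-1", "fields": {"address": "new", "phone": "555-0100"}})

# Every channel replaying the same log sees the same, current record:
print(materialize_view()["C-1"])  # {'address': 'new', 'phone': '555-0100'}
```

Because all consumers derive their state from the same ordered log, no channel can drift ahead of or behind the others, which is the consistency property the passage describes.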

 

An Open Source Approach to Data Analytics

“For the past 40 years, agencies have used data warehouses to collect and analyze their data. Although those warehouses worked well, they were limited in what they could do. For instance, they could only handle structured data, but by some estimates, 90% of agencies’ data is unstructured and in the form of text, images, audio, video and the like. Furthermore, proprietary data warehouses can show agencies what has happened in the past but can’t predict what might happen in the future. To achieve the government’s goal of evidence-based decision-making, agencies need to be able to tap into all their data and predict what might come next.”

Read more insights from Howard Levenson, Regional Vice President at Databricks.

 

Download the full Innovation in Government® report for more insights from these open source thought leaders and additional industry research from FCW.