Federal judge temporarily blocks the Pentagon from branding AI firm Anthropic a supply chain risk

SAN FRANCISCO (AP) — A federal judge has ruled in favor of artificial intelligence company Anthropic in temporarily blocking the Pentagon from labeling the company as a supply chain risk.

U.S. District Judge Rita Lin on Thursday said she was also blocking President Donald Trump’s directive ordering all federal agencies to stop using Anthropic.

Lin’s ruling followed a 90-minute hearing in San Francisco federal court on Tuesday, at which she questioned why the Trump administration took the extraordinary step of denouncing Anthropic as a supply chain risk after negotiations over a defense contract soured over the company’s effort to prevent its AI technology from being deployed in fully autonomous weapons or in surveillance of Americans.

Anthropic, maker of the chatbot Claude, had asked Lin to issue an emergency order to remove a stigma that the company alleges was unjustifiably applied as part of an “unlawful campaign of retaliation” that provoked the San Francisco-based company to sue the Trump administration earlier this month. The Pentagon had argued that it should be able to use Claude in any way it deems lawful.

Lin said her ruling was not about that public policy debate but about the government’s actions in response to it.

“If the concern is the integrity of the operational chain of command, the Department of War could just stop using Claude. Instead, these measures appear designed to punish Anthropic,” Lin wrote.

Anthropic has also filed a separate, narrower case that is still pending before the federal appeals court in Washington, D.C.

Lin wrote that her order is delayed for a week and doesn’t require the Pentagon to use Anthropic’s products or prevent it from transitioning to other AI providers.
