Obaidullah Syed, a Chicago-based technology executive, pleaded guilty to illegally exporting computer equipment to a nuclear research agency owned by the Pakistani government, the U.S. Attorney's Office for the Northern District of Illinois said. Syed owned Chicago-based Business System International USA and Pakistan-based Business System International, both of which provide high-performance computing platforms, servers and software application solutions. From 2006 to 2015, Syed conspired with other BSI employees in Pakistan to violate the International Emergency Economic Powers Act by shipping computer equipment to the Pakistan Atomic Energy Commission without obtaining prior authorization from the Commerce Department, the office said. The PAEC is a government agency responsible for designing and testing explosives and nuclear weapons parts, and was designated by the U.S. as an entity that could pose a national security threat, the office said. Syed admitted that he and his co-conspirators falsely told U.S.-based computer manufacturers that the shipments were destined for Pakistani universities or Syed's other businesses. BSI Pvt. Ltd. was also charged as a corporate defendant.
Although the Bureau of Industry and Security's new export controls on cybersecurity items are intended to restrict only malicious exports, they could place wide-ranging compliance burdens on the entire cybersecurity sector, law firms said. Technology companies and others operating in the sector still have time to convince BIS to narrow the scope of the rule, which takes effect in January but contains several “ambiguities,” firms said.
Technologies are emerging to combat deepfakes, but rules might be needed, panelists said at a Tuesday webinar hosted by the Convention of National Associations of Electrical Engineers of Europe (EUREL). Deepfake technology has some beneficial uses, but it's increasingly difficult to distinguish between real and fake content and people, said Sebastian Hallensleben, chairman of German EUREL member VDE e.V. One common argument, he said, is that AI fabrications aren't a problem because other AI systems can detect them. But as deepfakes become more sophisticated, countermeasures will multiply, creating a "detection arms race," Hallensleben said. What's needed is a "game-changer" to show what's real online and what isn't, he said. He's working on "authentic pseudonyms," identifiers guaranteed to belong to a given physical person and to be singular in a given context. This could be done through restricted identification along the lines of citizens' ID cards; a second route is through self-sovereign identity (SSI). If widely used, authentic pseudonyms would avoid the "authoritarian approach" to deepfakes, Hallensleben said. SSI is a new paradigm for creating digital ID, said Technical University of Berlin professor Axel Küpper. The ID holder (a person) becomes her own identity provider and can decide where to store her identity documents and which services to use. The infrastructure is a decentralized, tamper-proof, distributed ledger. The question is how to use the technology to mitigate misuse of automated content creation, Küpper said. Many perspectives besides technology must be considered for cross-border identification infrastructure, including regulation, governance, interoperability and social factors, said Tanja Pavleska, a researcher at the Jožef Stefan Institute Laboratory for Open Systems and Networks in Slovenia. Trust applies in all those contexts, she said. Asked whether the proposed EU AI Act should classify deepfakes as high-risk technology, she said such fakes aren't made by a single player or type of actor, so rules aimed at a single point might be difficult. All panelists agreed the EU General Data Protection Regulation should be interpreted to cover voice and facial data.
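The "singular in a given context" property Hallensleben described can be illustrated with a simple keyed derivation: the same person always gets the same identifier within one context, but identifiers from different contexts can't be linked without the person's secret. This is a minimal sketch assumed for illustration only, not Hallensleben's restricted-identification scheme or any SSI standard; the per-person secret, context labels and function name below are hypothetical.

```python
import hmac
import hashlib

def authentic_pseudonym(person_secret: bytes, context: str) -> str:
    """Derive an identifier that is stable within one context but
    unlinkable across contexts without the person's secret."""
    return hmac.new(person_secret, context.encode("utf-8"), hashlib.sha256).hexdigest()

# Hypothetical per-person secret; in a real system it would come from a trusted
# ID issuer (restricted identification) or an SSI wallet, not a hard-coded value.
secret = b"example-per-person-secret"

print(authentic_pseudonym(secret, "forum.example.org"))   # same value every time in this context
print(authentic_pseudonym(secret, "social.example.net"))  # a different, unlinkable value elsewhere
```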
Last year's White House cybersecurity space policy directive (see 2009040042) helped raise awareness of the issue, but government and industry implementation has been lagging, said George Washington University Space Policy Institute Director Scott Pace on a CompTIA panel Tuesday. CompTIA Senior Director-Public Sector David Logsdon said the National Cybersecurity Center's Space Information Sharing and Analysis Center plans to report in November on perceived gaps in the space policy directive. Pace said he had hoped space agencies would have started talking more explicitly in acquisitions and requests for proposals about cybersecurity expectations. Until such principles become part of competitive considerations in acquisitions, "it's hard to get companies to start taking that seriously," he said, noting interagency discussions are needed. He said government should be more aggressive in industry outreach through Department of Homeland Security threat briefings, and more active in international engagement via standards bodies. Added Logsdon: "If we don't do it, the Chinese will." The space policy directive deliberately took "a soft approach" rather than a prescriptive one, to get grassroots buy-in, said Lockheed Martin Vice President-Technology Policy and Regulation Jennifer Warren. She said there's more to be done on adoption and implementation, but the directive had some success in raising awareness about the need to think of cybersecurity beyond satellites to the broader ecosystem, including earth stations and supply chains. Timelines for implementation should be aspirational, she said, with voluntary steps companies could take "to get that gold star." A lot of focus has been on technical issues like standards and nomenclature, but more thought should go to nontechnical issues of personnel security and insider threats, Pace said: "Every traitor in prison had a security clearance." Viasat Government Systems Chief Technology Officer Phil Mar urged paying more attention to smaller, emerging space companies, where cybersecurity often is a last-minute concern.
President Joe Biden extended a national emergency that authorizes certain sanctions against people and entities in the Democratic Republic of the Congo, the White House said Oct. 25. The situation in the country remains "marked by widespread violence and atrocities that continue to threaten regional stability," the White House said. The emergency was extended for one year beyond Oct. 27.
Senators told us they believe there's a feasible if narrow legislative window to reconfirm FCC Chairwoman Jessica Rosenworcel this year, act on fellow Democratic commission pick Gigi Sohn and confirm NTIA administrator nominee Alan Davidson. The White House announced President Joe Biden's intent to nominate the trio Tuesday, as expected. Biden designated Rosenworcel as permanent chair Tuesday; she had been acting head since January. The White House also nominated Winston & Strawn patent lawyer Kathi Vidal as Patent and Trademark Office director.
The Commerce Department needs to address several “urgent shortcomings” in its export control policies toward China (see 2110180016) and impose stricter export restrictions and license denials for sensitive goods and suppliers of Chinese military companies, a group of Republican lawmakers said in a letter to Secretary Gina Raimondo. The 17 Republicans, all members of the House’s China Task Force, also said the Bureau of Industry and Security should commit to a timeline for releasing more emerging and foundational technology controls and issue “appropriate” restrictions on fundamental research and open-source technology platforms.
The Treasury Department expects to issue more crypto-related sanctions and allocate more resources to better target the digital assets of cybercriminals, Treasury Deputy Secretary Wally Adeyemo said. Speaking during a Center for a New American Security event last week, Adeyemo also said the agency is placing a high priority on multilateral designations and is hoping to better understand trading partners' concerns about U.S. secondary sanctions.