What do engineering and government have in common?
While one involves barong-clad debates and the other involves hard hats and complex math, government and engineering share a DNA rooted in a single, unsexy word: regulation. In both worlds, regulation isn’t just about “obeying the law”; it’s the structural framework that prevents a system—be it a society or a skyscraper or an application—from collapsing under its own weight.
In the world of Computer Engineering (CpE), the “hard hats” are the logic analyzers and compilers, but the principle remains the same. Government and CpE both rely on regulation to ensure that complex, invisible systems don’t turn into “spaghetti code” that crashes society.

In the Philippines, AI regulation is shifting from “wait-and-see” to a proactive “debug” phase. The government is treating AI not as a magic black box, but as a system that needs the same rigorous Unit Testing and Bug Reporting as any other infrastructure.
The Foundation: Standards and Statutes
At their core, both fields are about managing messy, unpredictable forces. For the engineer, it’s gravity, friction, and fluid dynamics. For the government, it’s human behavior, economic greed, and social friction. Regulation serves as the “Safety Factor.” In engineering, we don’t design a bridge to hold exactly 1,000 kg; we design it for 3,000 kg just in case. Similarly, government regulations like the Philippine Clean Air Act (R.A. 8749) act as the safety factor for public health, ensuring industries don’t “redline” our environment into oblivion.
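The safety-factor idea can be sketched in a few lines of Python. This is a hypothetical illustration; the function names, the 3× factor, and the loads are the essay’s example numbers, not any real design code:

```python
# Hypothetical sketch: a safety factor as a design-time check.
# Names, the 3x factor, and loads are illustrative only.

def required_capacity(expected_load_kg: float, safety_factor: float = 3.0) -> float:
    """Return the capacity a design must actually hold, given a safety factor."""
    return expected_load_kg * safety_factor

def passes_check(rated_capacity_kg: float, expected_load_kg: float,
                 safety_factor: float = 3.0) -> bool:
    """Regulation-style check: rated capacity must meet the factored load."""
    return rated_capacity_kg >= required_capacity(expected_load_kg, safety_factor)

print(required_capacity(1000))   # 3000.0 — the bridge designed for 1,000 kg
print(passes_check(3000, 1000))  # True
print(passes_check(1500, 1000))  # False — fails the safety-factor check
```

The regulation isn’t the load; it’s the multiplier that keeps the system standing when the load surprises you.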
In CpE, regulation takes the form of protocols and standards (like IEEE or ISO). Just as the government uses the Data Privacy Act of 2012 (R.A. 10173) to set the “rules of the road” for personal information, computer engineers use protocols to ensure that a chip designed in Taiwan can “talk” to a server in Quezon City. Without these “regulations,” the digital world would be a Tower of Babel—lots of noise, but zero connection.

The Philippine government launched the National AI Strategy Roadmap 2.0 (NAISR 2.0) and established the Center for AI Research (CAIR) to lead this charge. This isn’t just about economic growth; it’s about building an “AI economy with conscience”. [1, 2, 3]
To Win or Lose? Nah. To Expose.
Here is where it gets interesting. We often think of regulation as a “win/lose” scenario—the government wins if it catches you, you win if you bypass the permit. But true engineering and governance follow a different mantra: the goal is to expose.

In engineering, we perform “destructive testing.” We push a material until it snaps. We aren’t trying to “beat” the steel; we are trying to expose its limits so we can build around them. In government, effective regulation—like the Ease of Doing Business Act—isn’t about “winning” against the bureaucracy. It’s about creating a transparent process that exposes bottlenecks and corruption. When the system is regulated and transparent, the flaws have nowhere to hide. You don’t “win” a building permit; the process “exposes” whether your design is safe for the Filipino people.
When a computer engineer runs a “boundary value analysis” on a piece of firmware, they aren’t trying to make the code “lose.” They are trying to expose the bug before it hits the market. In the same way a government “white hat” hacker might probe a government portal, the goal is to find the vulnerability and bring it into the light. Transparency is the ultimate debugger.
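Boundary value analysis is easy to show concretely. Below is a minimal, hypothetical sketch: a firmware-style setter that must accept a duty cycle of 0–100, probed just below, at, and just above each edge—exactly where off-by-one bugs hide. The function and values are illustrative, not from any real firmware:

```python
# Hypothetical sketch of boundary value analysis: probe the edges of a
# valid input range (0..100) to expose off-by-one bugs.

def set_duty_cycle(percent: int) -> int:
    """Accept a PWM duty cycle in 0..100; reject everything else."""
    if percent < 0 or percent > 100:
        raise ValueError("duty cycle out of range")
    return percent

def is_accepted(value: int) -> bool:
    try:
        set_duty_cycle(value)
        return True
    except ValueError:
        return False

# Test just below, at, and just above each boundary.
for value in (-1, 0, 1, 99, 100, 101):
    print(value, is_accepted(value))
# -1 and 101 are rejected; 0, 1, 99, and 100 are accepted.
```

Had the check been written as `percent >= 100` by mistake, the probe at exactly 100 would expose it—before the chip ships, not after.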
Just like a computer engineer uses stress tests, the Philippine regulatory framework uses specific mechanisms to expose hidden algorithmic biases: [4]
- Mandatory Audits & Assessments: House Bill No. 7913 (the AI Bill of Rights) focuses on protecting citizens from algorithmic bias. It proposes that high-risk systems—like those used in hiring or healthcare—undergo mandatory audits.
- The Transparency Debugger: The National Privacy Commission (NPC) Advisory No. 2024-04 requires “explainability”. This means companies must be able to explain the logic behind an AI’s decision. If an AI rejects a loan for a Filipino with a provincial address, the regulation forces the system to expose why, ensuring the decision isn’t based on discriminatory geographic bias.
- Human-in-the-Loop Verification: For the 2025 elections, COMELEC Resolution No. 11064 mandates the registration of all AI-generated campaign content. This is a “verification” step to expose deepfakes and misinformation before they can influence voters. [2, 4, 5, 6, 7]
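The “explainability” requirement can be pictured with a toy decision function that must return its reasons alongside its verdict. Everything here—the fields, thresholds, and rules—is hypothetical, and real NPC-compliant systems are far more involved; the point is only that the logic is exposed, not hidden:

```python
# Hypothetical sketch: a decision function that exposes its reasoning.
# Fields, thresholds, and rules are illustrative only.

def score_loan(applicant: dict) -> tuple[bool, list[str]]:
    """Return (approved, reasons) so every decision can be explained."""
    reasons = []
    if applicant["monthly_income"] < 20_000:
        reasons.append("monthly income below 20,000 threshold")
    if applicant["existing_debt_ratio"] > 0.5:
        reasons.append("debt ratio above 50%")
    # Note: no address or region field is ever consulted, so an auditor
    # can verify that a provincial address cannot influence the outcome.
    return (len(reasons) == 0, reasons)

approved, why = score_loan(
    {"monthly_income": 15_000, "existing_debt_ratio": 0.6}
)
print(approved)  # False
print(why)       # both reasons are surfaced, not buried
```

An auditor reading this can answer the regulator’s question—“why was this Filipino rejected?”—without reverse-engineering a black box.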
Madlang People: “Sample, sample”
Consider the National Building Code of the Philippines (P.D. 1096).
- The Engineering Side: It dictates the precise mix of concrete and the spacing of rebars. It forces the structure to “confess” its weaknesses during an earthquake.
- The Government Side: It ensures that a developer cannot prioritize profit over the life of a tenant.
When a building fails, the regulation wasn’t there to make the developer “lose” money; it was there to expose the fact that the design was insufficient before the ground started shaking.
Think about the Philippine Identification System. This is where CpE and Government regulation have a high-stakes “collaboration”:
- The Engineering Side: Engineers must adhere to strict encryption standards and biometric data regulations. They use Error Detection and Correction codes—not to “win” against data corruption, but to expose and fix it the moment a single bit flips.
- The Government Side: Regulation ensures that this data isn’t used for unauthorized surveillance. The law “exposes” the limits of what the state can and cannot do with your digital footprint.
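The Error Detection idea from the engineering side above can be shown with the simplest possible code: an even-parity bit over one byte. Real identity-system pipelines use far stronger codes (CRCs, Hamming codes), but the principle is identical—expose the flipped bit the moment it happens:

```python
# Hypothetical sketch: even-parity error detection over a single byte.
# Real systems use stronger codes (CRC, Hamming); the principle is the same.

def parity_bit(byte: int) -> int:
    """Even parity: 1 if the byte has an odd number of set bits, else 0."""
    return bin(byte & 0xFF).count("1") % 2

def check(byte: int, stored_parity: int) -> bool:
    """Return True if no single-bit error is detected."""
    return parity_bit(byte) == stored_parity

original = 0b1011_0010
p = parity_bit(original)
print(check(original, p))            # True: data intact

corrupted = original ^ 0b0000_0100   # a single bit flips in transit
print(check(corrupted, p))           # False: the error is exposed
```

The code doesn’t “win” against corruption—it makes corruption impossible to hide, which is the whole regulatory posture in miniature.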
If the system fails, it’s usually because someone tried to “win” (by cutting costs or rushing the rollout) instead of allowing the regulatory process to expose the system’s technical flaws during the testing phase.
In 2022, it was observed that some hiring platforms in the Philippines deployed algorithms that potentially disadvantaged applicants with non-Filipino surnames. [4]
- The “To Expose” Strategy: Instead of just banning the software, new guidelines from the National Privacy Commission (NPC) require Privacy Impact Assessments (PIA).
- The Result: Developers must “scrub” and regularly retrain their models. The goal is to expose if the training data is over-representing certain demographics (like Metro Manila residents) at the expense of others. [5, 8, 9]
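A PIA-style audit step like the one described can be sketched as a comparison between each region’s share of the training data and its share of the reference population. The regions, counts, and the 10-point tolerance below are all illustrative assumptions, not actual NPC figures:

```python
# Hypothetical sketch of a PIA-style audit: flag regions whose share of
# the training data far exceeds their reference population share.
# Regions, counts, and the 0.10 tolerance are illustrative only.
from collections import Counter

def region_shares(records: list[dict]) -> dict[str, float]:
    counts = Counter(r["region"] for r in records)
    total = sum(counts.values())
    return {region: n / total for region, n in counts.items()}

def over_represented(records: list[dict], reference: dict[str, float],
                     tolerance: float = 0.10) -> list[str]:
    """Expose regions whose training-data share exceeds the reference share."""
    shares = region_shares(records)
    return [region for region, share in shares.items()
            if share - reference.get(region, 0.0) > tolerance]

training = [{"region": "NCR"}] * 8 + [{"region": "Visayas"}] * 2
reference = {"NCR": 0.13, "Visayas": 0.20, "Mindanao": 0.24}
print(over_represented(training, reference))  # ['NCR'] — 80% of data, 13% of population
```

The audit doesn’t punish the model; it exposes that eight of ten training records came from Metro Manila—the kind of skew that quietly becomes geographic bias.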
Conclusion: Built to Last
Government and engineering are both exercises in “Applied Trust.” We trust that the floor won’t give way, and we trust that our taxes won’t vanish. Regulation is the mechanism that validates that trust. It’s not a cage; it’s a blueprint. By focusing on exposure over ego, both fields ensure that whether we are building a nation or a flyover along EDSA, the result is built to last.
Whether it’s a legislative bill or a microchip, regulation is the “Unit Test” of civilization. It forces the system to be honest. For the Filipino computer engineer, following a standard isn’t about red tape—it’s about making sure that when a lola in the province uses an app to receive her pension, the system is robust enough to handle the load.
By treating regulation as a method to expose flaws, the Philippines is ensuring that its “Digital Philippines” initiative isn’t built on a foundation of “buggy” logic. It turns the regulator into a Lead QA Engineer—someone whose job isn’t to say “No,” but to say, “Show me how this works so it doesn’t break our society.”
[1] https://www.lumifywork.com
