Monday, June 24, 2019



BIG UPDATE: YouTube has REMOVED the video from their platform. The video is still available on this page.
UPDATE 1: Congressman Louie Gohmert issued a statement, saying “Google should not be deciding whether content is important or trivial and they most assuredly should not be meddling in our election process. They need their immunity stripped…”
UPDATE 2: Google executive Jen Gennai RESPONDED to the video, saying, “I was having a casual chat with someone at a restaurant and used some imprecise language. Project Veritas got me. Well done.” 
Insider: Google “is bent on never letting somebody like Donald Trump come to power again.”
Google Head of Responsible Innovation Says Elizabeth Warren “misguided” on “breaking up Google”
Google Exec Says Don’t Break Us Up: “smaller companies don’t have the resources” to “prevent next Trump situation”
Insider Says PragerU And Dave Rubin Content Suppressed, Targeted As “Right-Wing”
LEAKED Documents Highlight “Machine Learning Fairness” and Google’s Practices to Make Search Results “fair and equitable”
Documents Appear to Show “Editorial” Policies That Determine How Google Publishes News
Insider: Google Violates “letter of the law” and “spirit of the law” on Section 230


(New York City) — Project Veritas has released a new report on Google which includes undercover video of a Senior Google Executive, leaked documents, and testimony from a Google insider.  The report appears to show Google’s plans to affect the outcome of the 2020 elections and “prevent” the next “Trump situation.”
The report includes undercover footage of longtime Google employee and Head of Responsible Innovation, Jen Gennai saying:
“Elizabeth Warren is saying we should break up Google. And like, I love her but she’s very misguided, like that will not make it better it will make it worse, because all these smaller companies who don’t have the same resources that we do will be charged with preventing the next Trump situation, it’s like a small company cannot do that.”
Jen Gennai
Said Project Veritas founder James O’Keefe:
“This is the third tech insider who has bravely stepped forward to expose the secrets of Silicon Valley.  These new documents, supported by undercover video, raise questions of Google’s neutrality and the role they see themselves fulfilling in the 2020 elections.”
Jen Gennai is the head of “Responsible Innovation” for Google, a sector that monitors and evaluates the responsible implementation of Artificial Intelligence (AI) technologies.  In the video, Gennai says Google has been working diligently to “prevent” the results of the 2016 election from repeating in 2020:
“We all got screwed over in 2016, again it wasn’t just us, it was, the people got screwed over, the news media got screwed over, like, everybody got screwed over so we’re rapidly been like, what happened there and how do we prevent it from happening again.”
“We’re also training our algorithms, like, if 2016 happened again, would we have, would the outcome be different?”
Google: Artificial Intelligence Is For A “fair and equitable” State
According to the insider, Machine Learning Fairness is one of the many tools Google uses to promote a political agenda.  Documents leaked by a Google informant elaborate on Machine Learning Fairness and the “algorithmic unfairness” that AI product intervention aims to solve:
The insider provided Google search examples demonstrating Machine Learning Fairness in action.
Google Machine Learning Fairness

“The reason we launched our A.I. principles is because people were not putting that line in the sand, that they were not saying what’s fair and what’s equitable so we’re like, well we are a big company, we’re going to say it.” – Jen Gennai, Head Of Responsible Innovation, Google

The Google insider explained the impact of artificial intelligence and Machine Learning Fairness:
“They’re going to redefine a reality based on what they think is fair, based upon what they want, and what is part of their agenda.”
Determining credible news and an editorial agenda…
Additional leaked documents detail how Google defines and prioritizes content from different news publishers and how its products feature that content.  One document, called the “Fake News-letter” explains Google’s goal to have a “single point of truth” across their products.

Another document received by Project Veritas explains the “News Ecosystem” which mentions “editorial guidelines” that appear to be determined and administered internally by Google.  These guidelines control how content is distributed and displayed on their site.
The leaked documents appear to show that Google makes editorial decisions about which news it promotes and distributes on its site.
Comments made by Gennai raise similar questions.  In a conversation with Veritas journalists, Gennai explains that “conservative sources” and “credible sources” don’t always coincide according to Google’s editorial practices.
“We have gotten accusations of around fairness is that we’re unfair to conservatives because we’re choosing what we find as credible news sources and those sources don’t necessarily overlap with conservative sources …” 
The insider shed additional light on how YouTube demotes content from influencers like Dave Rubin and Tim Pool:
“What YouTube did is they changed the results of the recommendation engine. And so what the recommendation engine is it tries to do, is it tries to say, well, if you like A, then you’re probably going to like B. So content that is similar to Dave Rubin or Tim Pool, instead of listing Dave Rubin or Tim Pool as people that you might like, what they’re doing is that they’re trying to suggest different, different news outlets, for example, like CNN, or MSNBC, or these left leaning political outlets.”
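The mechanism the insider describes can be sketched as a toy item-to-item recommender with an override list. This is a hypothetical illustration only; the similarity table, demotion set, and substitution logic are assumptions made for the sketch, not YouTube’s actual code:

```python
# Toy item-to-item recommender with an override list -- a hypothetical
# sketch of the behavior described above, not YouTube's actual system.

# "If you like A, you're probably going to like B" similarity table (assumed)
SIMILAR = {
    "Dave Rubin": ["Tim Pool"],
    "Tim Pool": ["Dave Rubin"],
}
DEMOTED = {"Dave Rubin", "Tim Pool"}   # channels no longer suggested
SUBSTITUTES = ["CNN", "MSNBC"]         # outlets suggested in their place

def recommend(channel):
    """Return suggestions, swapping any demoted channel for a substitute."""
    subs = iter(SUBSTITUTES)
    return [
        next(subs, s) if s in DEMOTED else s
        for s in SIMILAR.get(channel, [])
    ]

print(recommend("Dave Rubin"))  # ['CNN']
```

The point of the sketch is that the similarity computation is untouched; only the final listing step is intercepted, which matches the insider’s claim that “they changed the results of the recommendation engine.”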

Internal Google Document: “People Like Us Are Programmed” 
An additional document Project Veritas obtained, titled “Fair is Not the Default” says “People (like us) are programmed” after the results of machine learning fairness.  The document describes how “unconscious bias” and algorithms interact.

Veritas is the “Only Way”
Said the insider:
“The reason why I came to Project Veritas is that you’re the only one I trust to be able to be a real investigative journalist.  Investigative journalist is a dead career option, but somehow, you’ve been able to make it work.  And because of that I came to Project Veritas because I knew that this was the only way that this story would be able to get out to the public.”

“I mean, this is a behemoth, this is a Goliath, I am but a David trying to say that the emperor has no clothes. And, um, being a small little ant I can be crushed, and I am aware of that. But, this is something that is bigger than me, this is something that needs to be said to the American public.”
Project Veritas intends to continue investigating abuses in big tech companies and encourages more Silicon Valley insiders to share their stories through their Be Brave campaign.
As of publishing, Google did not respond to Project Veritas’ request for comment.  Additional leaked Google documents can be viewed HERE.
Other insider investigations can be viewed here:

Friday, June 21, 2019

Why the “National Popular Vote” scheme is unconstitutional

This article first appeared in the Daily Caller.


The U.S. Supreme Court says each state legislature has “plenary”
(complete) power to decide how its state’s presidential electors are
chosen.


But suppose a state legislature decided to raise cash by selling its
electors to the highest bidder. Do you think the Supreme Court would
uphold such a measure?


If your answer is “no,” then you intuitively grasp a basic principle
of constitutional law—one overlooked by those proposing the “National Popular Vote Compact” (NPV).


NPV is a plan to change how we elect our president. Under the plan,
each state signs a compact to award all its electoral votes to the
presidential candidate who wins the national popular vote. The compact
comes into effect when states with a majority of presidential electors
sign on.


In assessing the constitutionality of NPV, you have to consider some
of its central features. First, NPV abandons the idea that presidential
electors represent the people of their own states. Second, it discards
an election system balanced among interests and values
in favor of one recognizing only national popularity. That popularity
need not be high: A state joining the NPV compact agrees to assign its
electors to even the winner of a tiny plurality in a multi-candidate
election.


Third, because NPV states would have a majority of votes in the
Electoral College, NPV would effectively repeal the Constitution’s
provision for run-off elections in the House of Representatives.


Fourth, NPV requires each state’s election officer to apply the vote
tabulations certified by other state election officers—even if those
tabulations are known to be fraudulent or erroneous.  Indeed, NPV would
give state politicians powerful incentives to inflate, by fair means or
foul, their vote totals relative to other states.


Don’t changes that sweeping require a constitutional amendment?


In answer to this question, NPV advocates point out that the Constitution seemingly gives state legislatures unlimited authority to decide how their electors are appointed. They further note that the Constitution recognizes
the reserved power of states to make compacts with each other. Although
the Constitution’s text requires that interstate compacts be approved
by Congress, NPV advocates claim congressional approval of NPV is not
necessary. They observe that in U.S. Steel v. Multistate Tax Comm’n
(1978) the Supreme Court held that Congress must approve a compact only
when the compact increases state power at the expense of federal power.


NPV advocates may be wrong about congressional approval. It is unclear that the justices would follow U.S. Steel’s
ruling now. The Constitution’s language requiring congressional
approval is crystal clear, and the court today is much more respectful
of the Constitution’s text and historical meaning than it was in 1978.
Moreover, you can make a good argument that U.S. Steel requires
congressional approval for NPV because NPV would weaken federal
institutions: It would (1) abolish the role of the U.S. House of
Representatives in the electoral process and (2) alter the presidential
election system without congressional involvement. Furthermore, even the
U.S. Steel case suggested that compacts require congressional approval whenever they “impact . . . our federal structure.”


A more fundamental problem with NPV, however, is that with or without
congressional approval it violates a central principle of
constitutional law.


The Constitution recognizes two kinds of powers: (1) those reserved
by the Tenth Amendment in the states by reason of state sovereignty
(“reserved powers”) and (2) those created and granted by the
Constitution itself (“delegated powers”). Reserved powers are, in James
Madison’s words, “numerous and indefinite,” but delegated powers are
“few and defined.”


A state’s power to enter into a compact with other states is reserved
in nature, and it almost always involves other reserved powers, such as
taxation and water use. Such was the compact examined by the Supreme
Court in the U.S. Steel case.


As for delegated powers, the Constitution grants most of these to agents of the federal government. However, it also grants some to entities outside the federal government. Recipients include state legislatures, state governors, state and federal conventions, and presidential electors.


The scope of delegated powers is “defined” by the Constitution’s
language, construed in light of its underlying purpose and its
historical context. If state lawmakers or officers try to employ a
delegated power in a way not sanctioned by its purpose and scope, the
courts intervene.


For example, the courts often have voided
efforts to exercise delegated powers in the constitutional amendment
process in ways inconsistent with purpose or historical understanding.
This is true even if the attempt superficially complies with the
Constitution’s text.


Like a state legislature’s authority to act in the amendment process,
its power to decide how electors are appointed is a delegated one. In
exercising it, the legislature must comply with the overall purpose of
the presidential election system and the historical understandings
surrounding it. For example, the Founders, including those who approved
the Twelfth Amendment, designed the system to serve multiple
interests, not merely candidate popularity. And they conceived of an
elector as a person who acted on behalf of the people of his state—much
like a legislator, but with more limited functions.


In deciding how electors are appointed, state lawmakers may choose
among a range of procedures. But they have a constitutional duty to
choose a method consistent with the electoral system’s purpose and
design. Attempting to convert electors into agents of other states—like selling them to the highest bidder—would be an unconstitutional breach of public trust.


Electoral College Rules Made Simple (or, rather, less complicated)—2nd in a Series


  • November 30, 2017
  • Rob Natelson
The first article
in this series surveyed the problems the framers encountered in
crafting a mode for choosing the president and how they addressed those
problems. This installment explains in detail the Constitution’s
compressed and technical language as it was understood after adoption of
the Twelfth Amendment in 1804. Variations between the original understanding and modern practice are noted in this article.


The Constitution initially provided that after the choice of the
president the person with the most electoral votes would become vice
president. This might be the second-highest electoral vote-getter, but
not necessarily: If the election was thrown into the House of
Representatives because no candidate had won a majority of the electoral
votes, the House could elect any of the five top vote getters. If the
House did not elect the top vote getter, then that person would become
vice president.


There was some sense behind this system. Many founders were concerned
about the risk of a “cabal”—informally organized political
intrigue—between the president and the powerful, relatively small
Senate. Installing the president’s leading rival as vice president, and
therefore as the Senate’s presiding officer, might check that risk.


Nevertheless, chaos during the 1800 election persuaded the founding
generation to add the Twelfth Amendment, providing that electors would
vote separately for president and vice president. The Constitution’s
resulting structure is as follows:


* Article II, Section 1, Clause 1: term of office.

* Article II, Section 1, Clause 2: appointment, number, and qualification of electors.

* The Twelfth Amendment: manner of holding election. (This phrase is explained below.)

* Article II, Section 1, Clause 4: time of election.

* Article II, Section 1, Clause 5 & Twelfth Amendment: qualifications of president and vice president.


Rather than treat each of these in order, it is easier to follow the structure of election rules as the founding generation thought about them.


During the founding era, election rules were said to fix the manner of election (sometimes called the “mode of election”). This term embraced the following five categories:


(1) The time of election, including (a) the term of office and (b) the time for voting.

(2) The qualifications of the voters.

(3) The qualifications of the candidates.

(4) The place of election, including (a) the boundaries of election districts and (b) the location of the polls.

(5) The manner of holding elections. The framers coined this phrase
to cover all the administrative details in the “manner of election”
other than time, qualifications, and place. It included the required
margin of victory (majority or plurality), how votes were cast, oaths,
vote counting and reporting, and election-day conduct. “Manner of
holding” also embraced the number of election stages—one stage for
direct elections, and two or more for indirect elections.


The “manner of holding elections” did not include the kind of omnibus
campaign regulation Congress presumes to impose today. Campaign
regulation was a state power. The modern Supreme Court says that “manner of holding” includes campaign regulations, but the court has never adequately supported this assertion.


In both Britain and America, the manner of election was governed by
statute under the general police power. However, the Constitution did
not leave the entire manner of election to either the state legislatures
or to Congress. The Constitution created a two- or three-stage
presidential election system and then prescribed at least some rules for
each stage.


Time of election. Article II, Section 1, Clause 1
specifies that the terms of the president and vice president are four
years. (This was supplemented by the Twentieth Amendment,
which fixed days of beginning and ending.) Otherwise, the timing of
elections is left to state law, except that Congress may fix a uniform
day for choice of electors (Stage #1) and for their balloting (Stage
#2).


Qualifications of Voters. For Stage #1, the
Constitution allows the states to set voter qualifications, although
this rule has been modified by several constitutional amendments and a
host of Supreme Court rulings. The agency for decision on this and other
issues is the state legislature.  Founding-era practice (as well as subsequent court decisions)
tells us that this use of “legislature” refers to the state’s entire
lawmaking apparatus, including any roles for the governor or popular
referenda. Thus, the use of “legislature” in the case of elections is
different from the use of that word in some other parts of the
Constitution, such as Article V, where it means only the representative
assembly itself.


The Constitution also left to the state legislatures the
qualifications for presidential electors, except that they cannot be
members of Congress or federal officeholders. Stage 3 is the
congressional run-off, so the voter qualification at this stage is to be
a member of the House (to vote for president) or the Senate (to vote
for vice president).


Qualifications of Candidates. The president and vice
president must be natural born citizens, residents of the U.S. for the
prior 14 years, and at least 35 years old. Unlike lawmakers in most
states, the framers specified no qualifications based on property, race,
or gender. This was a conscious decision.


The place of election. With one exception, the place
of election at Stages 1 and 2 was left to the state legislatures.
(After the Constitution was ratified, the states adopted a mixture of
at-large and district voting.) The exception was that presidential
electors were to meet in their respective states rather than congregate
together. That was to minimize the risk of mob or “stampeding” behavior.
Stage 3 congressional runoffs are held in the national capital.


The manner of holding elections. State legislatures
generally determine Stage 1 procedures. They may reserve the power to
choose electors or delegate it to the people. They decide whether the
rule of decision is a majority or a plurality. One writer
has suggested that the founders expected the states to adopt a majority
rule, but I have not found much evidence to support this.


Similarly, the states determine the method of voting. During the
founding era, there were four in common use: (1) viva voce, (2) show of
hands, (3) polling (in which voters filed past a registrar, verified
their identity, and stated their preference), and (4) “ballot”—which
invariably meant secret ballot.


Most Stage 2 procedures also are set by state law, but the
Constitution limits state discretion more than at Stage 1. The electors’
voting must be by ballot. Each elector votes once for president and
once for vice president; of those two candidates at least one must be from
another state. The latter rule was adopted for two reasons: (1) to
prevent large states from dominating the electors and (2) the theory
that the second choice after a “favorite son” was likely to be the
better candidate.


The electors are supposed to count the ballots, list how many votes
for each candidate, sign and certify the lists, and transmit them to the
president of the Senate at the national capital. At a joint session of
Congress, the president of the Senate opens the certificates and
arranges for a count. The rule of victory at this level is a majority of
electors appointed. If no candidate receives a majority, a
congressional run-off is necessary.


The Constitution prescribes Stage 3 run-off procedures in even
greater detail. If it appears that no candidate for president commands a
majority of electoral votes, the House of Representatives must “choose
immediately, by ballot, the President” from among the top three vote
getters. (The Twelfth Amendment changed the number from five to three.)
The quorum is at least one Representative present from each of two
thirds of the states. Voting is by state delegations, on a one state/one
vote basis. Election is by a majority of all states, not merely a
majority of states present.


If no candidate wins a majority of the electors for vice president,
the Senate selects from the top two candidates, with a quorum of two
thirds and the rule of victory being a majority of “the whole number of
Senators.”


Each part of this intricate system was adopted for good reasons. For
example, voting by states in the House prevents a few populous states
from dominating the election. The quorum of two thirds and the
majority-to-win requirement ensure that the victor enjoys wide popular
support.

Why Did the Framers Create the Electoral College?—1st in a Series

Colorado
went Democrat in the last presidential election. But three of those
elected as presidential electors wanted to vote for someone other than
Hillary Clinton. Two eventually cast ballots for Clinton under court
order, while one—not a party to the court proceedings—opted for Ohio
Governor John Kasich, a Republican. After this “Hamilton elector” voted,
state officials voided his ballot and removed him from office. The other electors chose someone more compliant to replace him.


Litigation over
the issue still continues, and is likely to reach the U.S. Supreme
Court. Moreover, President Trump’s victory in the Electoral College,
despite losing the popular vote, remains controversial. So it seems like
a good time to explore what the Electoral College is, the reasons for
it, and the Constitution’s rules governing it. This is the first of a
series of posts on the subject.


The delegates to the 1787 constitutional convention found the
question of how to choose the federal executive one of the most
perplexing they faced. People who want to abolish the Electoral College
usually are unfamiliar with how perplexing the issue was—and still is.


Here are some of the factors the framers had to consider:


* Most people never meet any candidates for president. They have very
little knowledge of the candidates’ personal qualities. The framers
recognized this especially would be a problem for voters considering
candidates from other states. In a sense, this is less of a concern
today because, unlike in 1787, we have mass media through which
candidates can speak directly to the voters. In other ways, however, it is more
of a concern than it was in 1787. Our greater population renders it
even less likely for any particular voter to be personally familiar with
any of the candidates. And, as I can testify from personal experience,
mass media presentations of a candidate may be 180 degrees opposite
from the truth. One example: media portrayal of President Ford as a
physically clumsy oaf. In fact, Ford had been an all-star athlete who remained physically active and graceful well into old age.


* Voters in large states might dominate the process by voting only for candidates from their own states.


* Generally speaking, the members of Congress would be in a much
better position to assess potential candidates than the average voter.
And early proposals at the convention provided that Congress would elect
the president. However, it is important for the executive to remain
independent of Congress—otherwise our system would evolve into something
like a parliamentary one rather than a government of three equal
branches. More on this below.


* Direct election would ensure presidential independence of
Congress—but then you have the knowledge problem itemized above. In
addition, there were (and are) all sorts of other difficulties
associated with direct election. They include (1) the potential of a few
urban states dictating the results, (2) greatly increased incentives to
electoral corruption (because bogus or “lost” votes can swing the
entire election, not just a single state), (3) the possibility of
extended recounts delaying inauguration for months, and (4) various
other problems, such as the tendency of such a system to punish states
that responsibly enforce voter qualifications (because of their reduced
voter totals) while benefiting states that drive unqualified people to
the polls.


* To ensure independence from Congress, advocates of congressional
election suggested choosing the president for only a single term of six
or seven years. Yet this is only a partial solution. Someone elected by
Congress may well feel beholden to Congress. And as some Founders
pointed out, a president ineligible for re-election still might cater to
Congress simply because he hopes to re-enter that assembly once he
leaves office. Moreover, being eligible for re-election can be a
good thing because it can be an incentive to do a diligent job. Finally,
if a president turns out to be ineffective it’s best to get rid of him
sooner than six or seven years.


* Elbridge Gerry of Massachusetts suggested election by the state
governors. Others suggested election by state legislatures. However,
these proposals could make the president beholden to state officials.


* The framers also considered election of the president by electors
elected by the people on a strict population basis. Unless the Electoral
College were very large, however, this would require electoral
districts that combined states and/or cut across state lines. In that
event, state law could not effectively regulate the process. Regulation
would fall to Congress, thereby empowering Congress to manipulate
presidential elections.


* In addition to the foregoing, the framers had to weigh whether a
candidate should need a majority of the votes to win or only a
plurality. If a majority, then you have to answer the question, “What
happens if no candidate wins a majority?” On the other hand, requiring
only a plurality might result in election of an overwhelmingly unpopular
candidate—one who could never unite the country. The prospect of
winning by plurality would encourage extreme candidates to run with
enthusiastic, but relatively narrow, bases of support. (Think of the
possibility of a candidate winning the presidency with 23% of the vote,
as has happened in the Philippines.)


The delegates wrestled with issues such as these over a period of
months. Finally, the convention handed the question to a committee of
eleven delegates—one delegate from each state then participating in the
convention. It was chaired by David Brearly, then
serving as Chief Justice of the New Jersey Supreme Court. The committee
consisted of some of the most brilliant men from a brilliant
convention. James Madison of Virginia was on the committee, as were John Dickinson of Delaware, Gouverneur Morris of Pennsylvania, and Roger Sherman of Connecticut, to name only four of the best known.


Justice Brearly’s “committee of eleven” (also called the “committee
on postponed matters”) worked out the basics: The president would be
chosen by electors appointed from each state by a method determined by
the state legislature. It would take a majority to win. If no one
received a majority, the Senate (later changed to the House) would
resolve the election.

EPA replaces failed Clean Power Plan


June 20th, 2019 | Energy
The Trump administration has officially ended Obama’s war on coal.


On Wednesday, June 19, EPA Administrator Andrew Wheeler signed the
Affordable Clean Energy plan, also known as “ACE.” ACE is the
replacement to Obama’s infamous Clean Power Plan, which wreaked havoc on
energy-job-dependent communities around the nation.


And CFACT was there in support!


Along with several members of Congress and dozens of coal miners,
CFACT’s student leader at George Mason University, David Bucarey, was
there at EPA headquarters to applaud the Trump administration’s efforts.


“I really loved the opportunity to come in to DC and see this great
reform be enacted,” said David, a sophomore at George Mason, which is
located in northern Virginia. “I sat next to and spoke to some of the
coal miners who were affected, and I really appreciate what they do in
providing energy for our country. Thank you to CFACT for helping me to
get more involved.”


President Trump had signed an order in 2017 directing the EPA to
remove the Clean Power Plan. This ceremony was the fulfillment of that
directive by EPA officials.


The new ACE rule
is designed to give states more power over deciding how to regulate
energy, meaning those states dependent on coal or gas can keep those
much-needed energy producing jobs available.


“The Affordable Clean Energy rule — ACE — gives states the regulatory
certainty they need to continue to reduce emissions and provide
affordable and reliable energy for all Americans,” Administrator Wheeler
stated.


“The contrast between our approach and the Green New Deal, or plans
like it, couldn’t be clearer. Rather than Washington telling Americans
what type of energy they can use, or how they can travel, or even what
they can eat, we are working cooperatively with the states to provide
affordable, dependable, and diverse supply of energy that continues to
get cleaner and more efficient.”





David
also got to meet and talk with Congressman Glenn Thompson of
Pennsylvania’s 15th congressional district, which relies heavily on
coal.


“I’ve never met a congressman before,” David said. “That was pretty cool.”


At one point during the proceedings, the speakers recognized all the
coal miners in attendance, who were mostly from Pennsylvania, and gave
them a standing ovation.


Despite the optimism from inside the meeting, Democratic Attorneys
General plan to sue the EPA over this new rule. These include New York
and Connecticut, but more states are sure to join in.


“This is yet another prime example of this administration’s attempt
to rollback critical regulations that will have devastating impacts on
both the safety & health of our nation,” said New York’s Attorney
General Letitia James in a tweet.


Gina McCarthy, Obama’s former EPA Administrator who put the Clean
Power Plan in place several years ago, said “I do not dispute any
administration coming in with different policies, but the challenge I
think we’re facing is they are really changing the rules of the road and
not using sound science.”


But since President Trump pulled the United States out of the Paris Climate Accord, US CO₂ emissions have been dropping, while European CO₂ emissions have been on the rise, despite Europe being the poster child for “green” energy policies.


In fact, according to the Statistical Review of World Energy by BP,
the United States led the world in CO₂ emissions reduction in 2017,
while the European Union’s emissions were actually up by 1.5%.


CFACT has long advocated for the repeal and replacement of the Clean Power Plan at both the federal and grassroots levels.


Passionate CFACT student leaders like David will continue to do the
hard work of rallying support for much-needed free market reforms like
this one on college campuses around the country. George Mason University
in particular, with its proximity to Washington, D.C., is a perfect
place to grow this already impactful network of Collegian activists.


Stay tuned for great things from David at George Mason, and all our
incredible Collegians from coast to coast, come the next school semester
this fall.

Unreliable Nature Of Solar And Wind Makes Electricity More Expensive, New Study Finds

Wind turbines in Penonome, Panama. (Credit: Associated Press)
Solar panels and wind turbines are
making electricity significantly more expensive, a major new study by a
team of economists from the University of Chicago finds.
Renewable Portfolio Standards (RPS) "significantly increase
average retail electricity prices, with prices increasing by 11% (1.3
cents per kWh) seven years after the policy’s passage into law and 17%
(2 cents per kWh) twelve years afterward," the economists write.
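The study's two headline figures can be cross-checked against each other: each pair of numbers implies roughly the same pre-policy baseline retail price. The sketch below derives that baseline (about 11.8 cents per kWh) from the quoted figures; the baseline itself is an inference here, not a number stated in the study.

```python
# Cross-check the quoted RPS figures: an 11% increase equal to 1.3 cents/kWh
# and a 17% increase equal to 2 cents/kWh should imply the same baseline
# retail price. Dividing each absolute rise by its percentage recovers it.
baseline_7yr = 1.3 / 0.11    # cents per kWh implied by the 7-year figures
baseline_12yr = 2.0 / 0.17   # cents per kWh implied by the 12-year figures

print(round(baseline_7yr, 1), round(baseline_12yr, 1))  # 11.8 11.8
```

Both pairs of figures point to the same baseline, so the percentages and the cents-per-kWh amounts are internally consistent.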


The study, which has yet to go through peer review, was done by
Michael Greenstone, Richard McDowell, and Ishan Nath. It compared states
with and without an RPS. It did so using what the economists say is
"the most comprehensive state-level dataset ever compiled" which
covered 1990 to 2015.


The cost to consumers has been staggeringly high: "All in all,
seven years after passage, consumers in the 29 states had paid $125.2
billion more for electricity than they would have in the absence of the
policy," they write.


Last year, I was the first journalist to report that solar and wind are making electricity more expensive in the United States — and for inherently physical reasons.





Solar and wind require that natural gas
plants, hydro-electric dams, batteries or some other form of reliable
power be ready at a moment’s notice to start churning out electricity
when the wind stops blowing and the sun stops shining, I noted.



And unreliability requires solar- and/or wind-heavy places like Germany, California, and Denmark to pay neighboring nations or states to take their solar and wind energy when they are producing too much of it.


My reporting was criticized — sort of
— by those who claimed I hadn't separated correlation from causation,
but the new study by a top-notch team of economists, including an
advisor to Barack Obama, proves I was right.


Previous studies were misleading, the economists note, because
they didn't "incorporate three key costs," which are the unreliability
of renewables, the large amounts of land they require, and the
displacement of cheaper "baseload" energy sources like nuclear plants.






The higher cost of electricity reflects "the costs that
renewables impose on the generation system," the economists note,
"including those associated with their intermittency, higher
transmission costs, and any stranded asset costs assigned to
ratepayers."


But are renewables cost-effective climate policy? They are not.
The economists write that "the cost per metric ton of CO2 abated exceeds
$130 in all specifications and ranges up to $460, making it at least
several times larger than conventional estimates of the social cost of
carbon."


The economists note that the Obama Administration’s core estimate
of the social cost of carbon was $50 per ton in 2019 dollars, while the
price of carbon is just $5 in the US northeast’s Regional Greenhouse
Gas Initiative (RGGI), and $15 in California’s cap-and-trade system.


Liberals' National Popular Vote Scheme Is Unconstitutional and Dangerous


As
of now, fourteen states have passed the National Popular Vote
Interstate Compact (NPVIC), which attempts to eliminate the Electoral
College as set forth in the United States Constitution.  There have been
many good articles written about the legality of interstate compacts
to achieve the desired National Popular Vote goals.  The author does
not need to rehash all of those problems but believes that there are
three additional ways that the NPVIC is both unconstitutional and
dangerous.




Constitutional Flaw #1: Non-Republican Form of Government



Article IV, Section 4 of the United States Constitution says in part that  "[t]he United States shall guarantee to every State
in this Union a Republican Form of Government."  The United States is a
constitutional republic, where people elect their senators and
representatives at the national level.  At the state level, this
structure is mirrored in every state, though Nebraska has a unique
unicameral Legislature.  A republican form of government, by definition, means
that people elect representatives to represent them in running the
government.  This is done so that the people are not encumbered with the
daily operations and voting to run the state or federal government.




A
fundamental problem with the NPVIC is that it is inherently not a
republican form of government for a specific state to select that
state's Electors.  Once a state Legislature decides to ask its citizens
their preference through a popular vote, there must be a rational basis
as to how the vote of the state's citizens is used to select that
state's electors.  It is not rational that the people's decision could
be overruled by the votes of citizens of unrelated states.  The
following comparison is between two states in the NPVIC who are at the
extremes of the Popular Vote Range for the 2016 election.




Vermont
has three electoral votes in our existing system and cast 315,067 votes
for president in 2016.  This constituted 0.23% of the total votes in
the nation.  Under the NPVIC, Vermont will give other states 99.77% of
the power to select its state's electors for president instead of
maintaining the 100% control it presently has.  Presently, there is a
total pool of 538 electors, and 0.23% constitutes 1.2 electors.  Vermont
has irrationally thrown away its automatic control of three Electoral
Votes for an effective control of 1.2 electoral votes.




At
the other extreme is the state of California, which has 55 electoral
votes in the present system.  In the 2016 presidential election, there
were 14,181,595 votes cast for president in California, which
constituted 10.4% of the nation's total votes.  California will give
other states 89.6% of the power to select its state's electors for
president instead of maintaining the 100% control it presently
has.  California has traded 55 electoral votes for an effective 56
electoral votes.  At least California's decision would result in a
higher number of effective electoral votes for the State, but it would
still hand 89.6% of the decision to other states.
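The "effective electoral votes" arithmetic used for Vermont and California above is simply a state's share of the national popular vote multiplied by the 538-elector pool. A minimal sketch, using only the vote shares quoted in the text:

```python
# Effective influence under the NPVIC: a state's share of the national
# popular vote times the total pool of 538 electors.
TOTAL_ELECTORS = 538

def effective_electors(national_vote_share):
    """Share of the national popular vote -> equivalent share of the 538 electors."""
    return national_vote_share * TOTAL_ELECTORS

vermont = effective_electors(0.0023)    # Vermont cast 0.23% of the national vote
california = effective_electors(0.104)  # California cast 10.4% of the national vote

print(round(vermont, 1), round(california, 1))  # 1.2 56.0
```

The output matches the figures in the text: Vermont's three guaranteed electoral votes shrink to an effective 1.2, while California's 55 rise slightly to an effective 56.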




Legislatures
of small states are committing a form of legislative malpractice by
joining the NPVIC.  The NPVIC is the latest in a 250-year history of
schemes where the populous states are trying to bully and dominate the
small states in the country.  Under the guise of the perceived
unfairness of specific presidential election outcomes, the large states
are trying to fool the small states into giving up the finely balanced
power they were guaranteed when they joined the United States.  In
addition, as different states implement different rules for voting, all other states would suffer the corruption of the national popular vote by sanctuary states.  Those states allow non-citizens to vote in some elections and/or make it likely that errors will result in ineligible people voting in presidential elections. 




Constitutional Flaw #2: Popular Vote Coercion



In
1824 (the 10th presidential election in U.S. history), there were four
candidates.  More importantly, there were many ways that states selected
their electoral votes.  In the 1824 election, the states of Delaware,
Georgia, Louisiana, New York, South Carolina, and Vermont did not have
any popular votes for president.  At that time, these states had a total
of 71 electoral votes out of a total of 261.  These states used various
approaches to apportion their electors in the manner they felt
best.  For instance, New York had electors vote for the following
candidates: Andrew Jackson (1), John Quincy Adams (26), Henry Clay (4),
and William Crawford (5).




States have significant flexibility in choosing their electors.  A state could have strong antiwar conscientious objector
feelings and decide that it is morally wrong to select the commander in
chief of the Armed Forces.  In this case, the Legislature could devise a
random process to select electors, or select none at all, so as not to
trample on the feelings of their citizens.  Another state could believe
strongly in astrology
and think birth sign is the most important factor in determining a
commander in chief.  Its Legislature could apportion electors using a
formula based on the birth signs of the candidates.  Though the author
hopes these seem extreme to the reader, it could be argued that they
have a rational basis from the perspective of their state legislatures.




Many
states have used methods other than the popular vote to select their
presidential electors in our nation's history.  The NPVIC would force
states to hold popular votes for president or lose power within our
constitutional republic.  This coercion would occur because, unless a
state held a popular vote with its results added into the national total,
it would lose power relative to the states in the NPVIC.  In the year
2000, the U.S. Supreme Court re-highlighted the right of state
legislatures to select electors through various means in Bush v. Gore.  These approaches included having the state legislatures take back the ability to choose electors from the people.




Constitutional Flaw #3: Removal of Critical Safety Mechanism



In
the last 13 presidential elections, there have been two where a
third-party candidate received more than 10% of the votes: 1968-Wallace
(13.5%) and 1992-Perot (18.9%).  There were an additional two
presidential elections where a third-party candidate received more than
5% of the vote: 1980-Anderson (6.6%) and 1996-Perot (8.4%).  In addition
to the earlier described 1824 election, the 1860 election in the
lead-up to the Civil War had four major candidates.  In the 1860
Election, the percentages of the popular vote were as follows: Lincoln
(39.8%), Douglas (29.5%), Breckinridge (18.1%), and Bell (12.6%).  The
electoral vote totals, which reflect popularity state by state, were very
different: Lincoln (180), Douglas (12), Breckinridge (72), and Bell
(39).  The electoral votes show that even though Douglas had almost 30%
of the national popular vote, he was the least preferred candidate when
states selected their electors.
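Converting the 1860 electoral vote counts into percentages (303 electoral votes were cast that year) makes the contrast with the popular-vote shares explicit:

```python
# 1860 electoral votes by candidate, as listed above (303 total that year).
ev = {"Lincoln": 180, "Douglas": 12, "Breckinridge": 72, "Bell": 39}
total = sum(ev.values())  # 303

# Convert counts to percentages for direct comparison with the popular vote.
ev_pct = {name: round(100 * votes / total, 1) for name, votes in ev.items()}
print(ev_pct)  # {'Lincoln': 59.4, 'Douglas': 4.0, 'Breckinridge': 23.8, 'Bell': 12.9}
```

Douglas's 29.5% of the popular vote collapses to 4.0% of the electoral vote, the point the passage is making.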




As
the number of candidates for election increases, the likelihood of
having an extreme candidate receive the most popular votes goes up
dramatically.  Germany held a federal election in November of 1932, and
the results were as follows: National Socialist German Workers Party
(33.1%), Social Democratic Party of Germany (20.4%), Communist Party
(16.9%), Centre Party (11.9%), and the German National People's Party
(8.3%).  If a fifth major candidate had run for president in the United
States in 1824 or 1860, the percentages could have appeared similar.  If
the reader hasn't figured it out yet, the leader of the National
Socialist German Workers Party was Adolf Hitler.  Even though fewer than
one third of all German voters selected the National Socialist German
Workers Party, the NPVIC approach would have automatically made Adolf
Hitler president with no safety mechanism.




The
Electoral College is only part of the genius of the system our founders
created to select a president.  There is a second step involved if no
candidate receives a majority of the electoral votes.  This has happened
twice (1800 and 1824), but the 1824 case is the more
illustrative.  When no candidate receives a majority of the electoral
votes, the election goes to the United States House of
Representatives.  Each state gets a single vote to choose among the top
three recipients of electoral votes, as specified by the 12th Amendment
to the U.S. Constitution.  In 1824, Andrew Jackson had the most popular
votes, and the most electoral votes, but they were not a majority.




In
1824, Andrew Jackson was a political outsider who was eyed with
distrust in Washington.  When the Election of 1824 went to the U.S.
House of Representatives, the states were allowed to identify the best
compromise candidate they could find from the top three electoral vote
recipients.  The U.S. House voted: John Quincy Adams (13),  Andrew
Jackson (7), and William Crawford (4).  John Quincy Adams, the son of
our country's second president, was elected president by the House of
Representatives in 1824.  In spite of losing the 1824 election in the
House of Representatives, Andrew Jackson came back to win the presidency
outright in the Electoral College in 1828 and 1832, and he is honored
on the $20 bill.




Conclusion



In
the 13 presidential elections that the author can remember, he has felt
emotions ranging from being thrilled, being happy, being worried, and
being disgusted with the results.  Since we live in a great country,
where honest Americans can have different views, the author is sure that
many people felt differently.  Unfortunately, the fact that someone
doesn't like who wins specific elections is no excuse for trying to
dramatically change the genius of our presidential election
system.  This paper shows how the NPVIC would not only be
unconstitutional in three key ways, but would potentially be dangerous
to our nation.

Thursday, June 20, 2019

A shocking scientific discovery: Winds affect ocean temperatures!



I
am shocked to learn that winds affect ocean temperatures.  I am also
surprised to learn from the National Oceanic and Atmospheric
Administration (NOAA) that ocean currents, temperatures around the
Earth, and carbon dioxide levels are all related to the Earth's rotation
and the sun and that the ocean currents are on a 1,000-year cycle.




CNN:



Boaty McBoatface makes significant climate change discovery on first mission



The
British research submarine Boaty McBoatface has made an impressive
debut in the scientific arena, discovering a significant link between
Antarctic winds and rising sea temperatures on its maiden outing.



NOAA writes:



Ocean water is on the move, affecting your climate, your local ecosystem, and the seafood that you eat. Ocean currents,
abiotic features of the environment, are continuous and directed
movements of ocean water. These currents are on the ocean's surface and
in its depths, flowing both locally and globally.




Winds,
water density, and tides all drive ocean currents. Coastal and sea
floor features influence their location, direction, and speed. Earth's
rotation results in the Coriolis Effect which also influences ocean currents.




Large-scale surface ocean currents are
driven by global wind systems that are fueled by energy from the sun.
These currents transfer heat from the tropics to the polar regions,
influencing local and global climate. The warm Gulf Stream originating
in the tropical Caribbean, for instance, carries about 150 times more
water than the Amazon River. The current moves along the U.S. East Coast
across the Atlantic Ocean towards Europe. The heat from the Gulf Stream
keeps much of Northern Europe significantly warmer than other places
equally as far north.




Differences
in water density, resulting from the variability of water temperature
(thermo) and salinity (haline), also cause ocean currents. This process
is known as thermohaline circulation. In cold regions, such as the North
Atlantic Ocean, ocean water loses heat to the atmosphere and becomes
cold and dense. When ocean water freezes, forming sea ice, salt is left
behind causing surrounding seawater to become saltier and denser.
Dense-cold-salty water sinks to the ocean bottom. Surface water flows in
to replace the sinking water, which in turn becomes cold and salty
enough to sink. This "starts" the global conveyer belt,
a connected system of deep and surface currents that circulate around
the globe on a 1000 year time span. This global set of ocean currents is
a critical part of Earth's climate system as well as the ocean nutrient
and carbon dioxide cycles.



Other shocking scientific discoveries:



  • Record snow and rains cause floods. 
  • Record snow causes reservoirs to fill. 
  • Long droughts before the use of petroleum products and fossil fuels caused deserts to form. 
  • The Earth is warmer in the summer and colder in the winter. 
  • People in underdeveloped countries, where fossil fuels and petroleum products are not used as much, die younger. 
Scientific results too often relate to how much money scientists can extract from the government and other sources.



The
climate has always changed naturally throughout millions of years of
history.  Temperatures have risen and fallen.  Ice has come and
gone.  Sea levels rise and fall; droughts come and go.  Floods
happen.  Storm activity fluctuates.  All this has occurred prior to
humans and fossil fuels and has no correlation with whether CO2 has been
higher or lower.




The
indoctrination, supported by almost all the media, is dangerous to our
freedom.  People must be completely ignorant or naïve to believe that
politicians, bureaucrats, and scientists can control temperatures, sea
levels, and storm activity forever if we just hand over trillions of our
hard-earned dollars to them.  The only thing that does is make the very
wealthy D.C. area richer than it already is.




Do
we believe that the government can control the temperature on the sun,
the rotation of the Earth, the ocean currents, and the winds in
Antarctica?  Please explain how.  The politicians who promise they can
control the climate couldn't even keep their promise that we could keep
our doctors and our health plans and that our premiums would go down if
we just let them pass Obamacare. 




Isn't
it time journalists used their brains, common sense, and logic instead
of just repeating what they are told to push an agenda?  Wouldn't the
public trust them more?

Friday, June 14, 2019

Can We Arrest Global Warming By Yanking CO2 From The Atmosphere?

FILE - In this July 27, 2018, file photo,
the Dave Johnson coal-fired power plant is silhouetted against the
morning sun in Glenrock, Wyo. Jeremy Grantham, a British billionaire
investor who's a major contributor to environmental causes, will fund
carbon-capture research in Wyoming, the top U.S. coal-mining state.
Wyoming's Republican governor, Mark Gordon, and the carbon-capture
technology nonprofit Carbontech Labs announced Thursday, March 28, 2019,
they're providing $1.25 million to help researchers find ways to turn
greenhouse-gas emissions into valuable products. (AP Photo/J. David Ake,
File)
ASSOCIATED PRESS
When the National Climate Assessment was
released late last year, it said that the risk of catastrophic climate
change was such that new tools had to be created to literally yank CO2
from atmosphere — to sequester it from ambient air. But what is being
done to achieve this and is this technology feasible?
Study after study has concluded that
climate change is primarily human-induced and that global average
temperatures are the highest they have ever been. And if the level of
heat-trapping emissions is not brought down, it will affect every
dimension of human life — from energy production to water availability to infrastructure development.
“The emitting has gone on so long and has
reached such scale that to avert a 2 degree rise, we need carbon
capture and sequestration to be implemented across the globe,” says Sam
Feinburg, chief operating officer and executive director of Helena, which serves as a brain trust to solve climate issues — one that brings together financial, intellectual and political capital.

“We used to ask if this is feasible,” he
continues in an interview. “Now we know it is eminently feasible, but
the bigger question is, ‘Will we get there in time?’ It is very
important that the world embrace this at scale. It is embryonic now. It
will get there. We will accelerate that timeline.”
The 2018 National Climate Assessment
— produced by 13 U.S. agencies — said that to keep the temperature rise to
2 degrees Celsius (3.6 degrees Fahrenheit) by midcentury, public
policy must help facilitate the development of low-to-no-carbon fuels to
power utilities and vehicles.
For those who think the economic cost of such a transition is too high,
think again: The price of inaction, according to the authors, will be a
half-trillion dollars a year.
Helena is specifically working with Swiss-based Climeworks to scrub CO2 from the atmosphere — after it has left the smokestack. Climeworks’ goal is
to remove 1% of the world’s total annual CO2 emissions by 2025 — or 10
billion tons of CO2 each year by 2050. Right now, the CO2 that the
company is capturing is re-used for plant life. But if it is to have a
real impact on mitigating the effects of climate change, that CO2 must
be sequestered underground. And that’s hard right now because it costs
$600 to remove a ton of CO2. But Climeworks says it will get that down
to $100 a ton in a few years.
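For scale, the removal goal and per-ton prices quoted above imply an enormous annual bill. The back-of-the-envelope below is an illustration derived from those figures, not a cost estimate from the article:

```python
# Annual cost of removing 10 billion tons of CO2 per year (the stated 2050
# goal) at today's quoted price versus Climeworks' target price.
TONS_PER_YEAR = 10e9

cost_at_600 = TONS_PER_YEAR * 600  # dollars per year at $600 per ton
cost_at_100 = TONS_PER_YEAR * 100  # dollars per year at $100 per ton

print(cost_at_600 / 1e12, cost_at_100 / 1e12)  # 6.0 1.0 (trillions of dollars)
```

Even at the target price, sequestering 10 billion tons a year would run about a trillion dollars annually, which is why the per-ton cost reduction matters so much.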


In this May 31, 2018, photo, a solar panel array collects sunlight near Fremont, Neb., with a power plant seen behind it. Solar energy is gaining traction in a small but growing number of Nebraska cities, but the technology still faces a number of obstacles that are keeping it from spreading faster. (AP Photo/Nati Harnik)
ASSOCIATED PRESS
Catalytic Effect
“The technology needs a catalytic effect: folks must be able to bet on it,” says Henry Elkus, chief executive of Helena,
in an interview. “There is an amount of research and development that
needs to happen to drop the price of a ton of carbon to reach
feasibility. Helena can facilitate that time span.
“We have seen this with solar,” he
continues. “There was a time when it was not viable and now it is.
Carbon capture deserves to be adopted.”
Helena has been working with Climeworks
for two years, providing it with the experts who give consulting and
scientific advice — without taking a financial position. The company, in
fact, has been listed as a “Top 20” in the field of carbon capture,
along with: ExxonMobil, Occidental Petroleum, General Electric,
Mitsubishi Heavy Industries, Schlumberger, Royal Dutch Shell and NRG
Energy. Meanwhile, two of Climeworks’ competitors that are also focused
on pulling CO2 from the atmosphere are listed: Global Thermostat and
Carbon Engineering.
To be clear, Helena is focused on real
climate solutions — ones that also include the increasing use of
renewable energies. To that end, the Bloomberg Foundation
has just launched a “Beyond Carbon” campaign that is financed with a
$500 million investment. It will work with state and local organizations
to pass laws to get to 100% clean energy by, in part, expanding
low-carbon transit, cleaning up buildings and promoting low-carbon
manufacturing.
The same foundation has worked to close
the nation’s coal plants, with 289 of the 530 being mothballed since
2011. It aims to close the rest by 2030. But it says that replacing that
fuel with natural gas will do little to dent the carbon threat over the
long term, meaning that such energy should be replaced with cleaner
alternatives.
Natural gas,
though, is used to firm up wind and solar plants when the weather does
not permit. And it has helped accelerate the growth of those fuels by
preventing outages. That means that carbon capture is a technology that
should not be forsaken — especially because this country will continue
to use natural gas while the developing world will continue to rely on
coal.
“The goal is to arrest global warming,”
says Helena’s Feinburg. “It is about getting carbon capture to a
commercial and industrial scale. We are about accelerating the growth
curve and adoption of the technology.”
Carbon capture
is an expensive tool that will require new resources to bring it to
fruition. And while many will argue that those monies would be better
targeted to prevailing cleaner fuels, the reality is that the world will
still depend on fossil fuels. The development of the technology is
therefore unavoidable if we are to meet the challenges presented by
climate change.

Great Lakes Reveal a Fatal Flaw in Climate Change ‘Science’


I & I Editorial





Lake Erie and Lake Superior — two of the five
that make up the Great Lakes — broke records for water levels this May.
Lakes Michigan and Huron could follow suit.
Naturally, climate change is getting the blame. “We are undoubtedly
observing the effects of a warming climate in the Great Lakes,” says
Richard Rood, a University of Michigan climate scientist.





But just a few years ago, climate scientists were insisting that a warming climate would cause water levels to decline.





In 2008, Science Daily
reported on a study that attributed the decline in Great Lakes water
levels to global warming. The researchers who conducted the study said
that the drop “raised concern because the declines are consistent with
many climate change predictions.”





In 2009, Columbia University’s Earth Institute
informed us that “most climate models suggest that we may see declines
in lake levels over the next 100 years; one suggests that we may see
declines of up to 8.2 feet.”





In 2011, the Union of Concerned Scientists said
that “scientists expect water levels in the Great Lakes to drop in both
summer and winter, with the greatest declines occurring in Lakes Huron
and Michigan.”





In 2013, the Natural Resources Defense Council said that “it’s no secret that, partially due to climate change, the water levels in the Great Lakes are getting very low.”





That same year, Think Progress reported
that “Several different climate models for the Great Lakes region all
predict that lake levels will decline over the next century.”





Since the Great Lakes account for 21% of the world’s surface fresh
water, these stories were all wrapped in doom-and-gloom scenarios about
the impact on drinking water, shipping, recreation, and so on.





The very next year, however, water levels started rising.





So what are scientists saying now? Simple. They’re now claiming that the fall and rise of Great Lakes’ water levels are due to climate change.





“Climate change is driving rapid shifts between high and low water levels on the Great Lakes,” is the new “consensus.” 





The truth, of course, is that water levels in the Great Lakes vary
over time. And, as a matter of fact, they varied far more in the past
than they do now. A U.S. Geological Survey notes that “prehistoric levels exceed modern-day fluctuations.”





It says that “Prehistoric variations in lake levels have exceeded by
as much as a factor of 2 (that is, more than 3 meters) the 1.6-meter
fluctuation that spanned the 1964 low level and the 1985-87 high
level.”





And, as anyone who’s ever lived near the Great Lakes knows, the lakes themselves were formed
in the wake of a massive change in the earth’s climate — when the glaciers
receded at the end of the Ice Age roughly 14,000 years ago.





So if the lakes’ huge fluctuations in the past weren’t caused by
mankind’s burning fossil fuels, why are scientists so convinced that the
far more minor changes happening today are?





The reason is simple. Climate scientists can blame anything they want
on global warming. The climate models are imprecise enough that no
matter what is happening they can point to it as proof that man-made
climate change is happening. Too much rain, too little rain, bitterly
cold winters, mild winters, more snow, less snow, rising water levels,
falling water levels — they attribute it all to “climate change.”





But if nothing can disprove a theory, and every event, no matter how
contradictory, is proof that the theory is valid, is that really
science? Sounds more like a religion to us.