Overall, the survey found that 54% of the public are “comfortable and prepared to support” Trump as president. That’s down 2 points from when he took office in 2016. Some 41% are not comfortable, up 5 points from 2016.
The survey of 1,000 people nationwide was taken Dec. 5-8. It has a margin of error of ±3.1 percentage points.
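(For what it’s worth, that ±3.1 figure is exactly what the textbook 95% margin-of-error formula produces for a simple random sample of 1,000; the quick Python check below assumes the conservative worst case of p = 0.5, which maximizes the variance.)

```python
import math

def margin_of_error(n: int, p: float = 0.5, z: float = 1.96) -> float:
    # z = 1.96 is the 95% confidence multiplier; p = 0.5 is the
    # conservative worst case, since it maximizes p * (1 - p).
    return z * math.sqrt(p * (1 - p) / n)

print(f"n=1,000:  +/-{margin_of_error(1_000):.1%}")   # +/-3.1%
print(f"n=10,000: +/-{margin_of_error(10_000):.1%}")  # +/-1.0%
```

Note that quadrupling the sample only halves the margin, which is why pollsters rarely go far beyond 1,000 respondents.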
The survey found 60% say deploying the military to the border to stop illegal drugs and human trafficking should be a 2025 priority for the new administration, with an additional 13% saying it should still be done, just later in the term. Only 24% oppose the proposal outright, including 51% of Democrats, 12% of independents and 3% of Republicans.
Support for raising tariffs is more lukewarm, with 27% backing it outright and 24% saying it can wait until later in the term. It’s opposed by 42% of respondents.
I’m sorry, this article isn’t worth the bits it’s saved in. Trying to read the national opinion and using just 1,000 people is bad science. At best this represents the (very small) portion of the population who would waste their time responding to a junk survey.
That’s a pretty standard size for a national opinion survey. How large do you think they’re supposed to be?
For good, reliable data, several orders of magnitude more than 1,000 and it would need to have the methodology and data published along with it.
Opinion polls in general are not reliable sources of information and the wrong approach anyway. Telling people that X% of their neighbors hold Y opinion is a well known and effective propaganda and marketing tool for influencing opinion and decision making.
It’s essentially institutional peer pressure.
I have never seen any sort of poll of Americans several orders of magnitude more than 1,000. Can you give an example, please?
Pew Research is pretty much the gold standard. In a recent survey on Ukraine they polled almost 10,000 people: https://www.pewresearch.org/short-reads/2024/11/25/wide-partisan-divisions-remain-in-americans-views-of-the-war-in-ukraine/
Also they post their dataset and methodology. Any poll/survey that doesn’t do that is reasonably suspect.
This Pew Research?
The one with this page entitled, “How can a survey of 1,000 people tell you what the whole U.S. thinks?”
https://www.pewresearch.org/short-reads/2017/05/12/methods-101-random-sampling/
Did you even watch the video? Do you not see the difference between what Pew does with 1,000 people and what fucking CNBC does?
I thought the argument was that you couldn’t get an accurate sample of Americans with just 1,000 people, not that CNBC’s methodology was wrong.
Read what I wrote slowly again. I said Pew was the gold standard, said how many they polled in a recent survey as an example, and highlighted that they posted their data and methodology. I never said there was a minimum.
CNBC doesn’t provide any of their data and has no published methodology; this might as well be the results of an online poll, like the ones Fox News runs all the time.
Most of them would likely be from an academic source. That kind of polling is very expensive and time-consuming, so there probably aren’t commercial, short-term polls with that level of rigor.
A 2020 Berkeley study found that the accuracy of election surveys (which are conducted much like opinion polls) is grossly overstated.
A 2018 Cambridge study says “the level of error has always been substantially beyond that implied by stated margins of error.”
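To see how error can blow past a stated margin, here’s a toy simulation; every number in it (group sizes, opinion rates, response rates) is invented for illustration. One demographic is simply twice as likely to answer the phone:

```python
import random

random.seed(42)

# Invented population: 30% are in group A, where 70% hold opinion X;
# the other 70% are in group B, where ~41.4% hold X, so the true
# overall rate is exactly 50%. Group A answers the phone twice as often.
P_A, X_A = 0.30, 0.70
X_B = (0.50 - P_A * X_A) / (1 - P_A)
RESPONSE = {"A": 0.30, "B": 0.15}

def one_poll(n=1000):
    hits = 0
    for _ in range(n):
        while True:  # keep dialing until someone answers
            group = "A" if random.random() < P_A else "B"
            if random.random() < RESPONSE[group]:
                break
        hits += random.random() < (X_A if group == "A" else X_B)
    return hits / n

estimates = [one_poll() for _ in range(200)]
print(sum(estimates) / len(estimates))  # ~0.546, not 0.50
```

The sampling math still advertises ±3.1 points, but the estimate comes out roughly 4.6 points high, because a stated margin of error only covers random sampling noise, not who picks up the phone.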
Okay: since that sort of polling would be very expensive and time-consuming, and people would like to know the opinions of their fellow citizens in aggregate, what would you suggest?
Nothing. That information is not actually useful for most people. But I fully acknowledge that’s just my opinion.
A better solution would be different metrics for different topics. Consumer faith in the economy can be measured by spending, especially if that data could be broken down by demographic. That data absolutely exists; whether businesses would make it public is another thing entirely.
The result of the election, especially given it was less than six weeks ago, is a much more compelling data point for how Americans feel about the president-elect and his policies. Just under half of all Americans voted, so that’s a pretty decent sample.
The “best solution” would be for news organizations to pool resources and do it more reliably. That would mean no more flash polls or opinion polls, in favor of longer-term tracking of public sentiment.
Social media companies also have much more robust data sets that better capture public opinion; they could share that quarterly, or even just sell reports to news outlets.
But polls are so unreliable, and so many people blindly trust and believe them, that eliminating that entire class of reporting would be preferable to continuing to publish and circulate it.
Polls seemed pretty reliable when it came to the election.
Have you seen the final count of the vote (which was released a week or two ago)? Neither candidate won a majority of the popular vote (Trump 49.9%, Kamala 48.4%), which the polling did not predict. They were projecting a very close race, but everything else they got wrong.
Yeah, I’m not sure how anyone can see “1,000 people accurately represent 330+ million people” and say, “yep, sounds about right, that does.”
The people conducting the polls use a technique called random sampling, where every person in the population has an equal chance of being selected; that’s what lets a small sample stand in for the whole. But it’s not perfect, and the academics think it’s sus too (I dropped a few studies in another comment).
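Here’s a rough sketch of why that works, using a made-up population where exactly 54% hold some opinion. Notice the population size never enters the math; only the sample size matters, provided the sample really is random:

```python
import random

random.seed(7)

TRUE_RATE = 0.54  # invented: share of the population holding opinion X

def poll(n=1000):
    # A true random sample: every person is equally likely to be picked.
    return sum(random.random() < TRUE_RATE for _ in range(n)) / n

results = [poll() for _ in range(1000)]
inside = sum(abs(r - TRUE_RATE) <= 0.031 for r in results)
print(f"{inside / 10:.1f}% of simulated polls landed within +/-3.1 points")
```

Run it and roughly 95% of the simulated 1,000-person polls land within the advertised margin. The catch, as the studies above suggest, is that real polls are never this cleanly random.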
You have to dig for it a bit, but the actual survey can be downloaded (as a pdf) from CNBC, and the data show bias. They over-represent people over 60. Their education numbers skew toward the less educated. Their racial numbers skew (slightly) toward white people. Their income numbers skew toward wealthier people.
Their voting data shows a major bias towards people who voted, but I’m actually okay with that one, at least in the context of the political reporting: the opinions of people who didn’t vote on the political situation in the US are not as important as the opinions of people who did. As part of an economic survey about holiday spending that also asked questions about the recent election, though, it’s not so great.
No single one of those biases would be a big deal, but in totality they add up to a significant and misleading skew that favors the opinions of older, white, middle-class respondents who vote and graduated high school (but attained no further education). That demographic is also the biggest consumer of CNBC content, so the reason for the bias seems fairly obvious. And again, for an “All-America Economic Survey” that’s not really a big deal, especially considering the massive gaps in the data they collected. But as a barometer for political opinion it skews the results in very important and meaningful ways.
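For what it’s worth, the standard fix for exactly this kind of skew is post-stratification weighting: down-weight the over-represented groups and up-weight the under-represented ones until the sample matches known population shares. A minimal sketch with invented shares (the CNBC pdf doesn’t say whether or how they weighted):

```python
# All shares below are invented for illustration, not CNBC's numbers.
population_share = {"60+": 0.23, "under_60": 0.77}  # e.g. from the Census
sample_share     = {"60+": 0.35, "under_60": 0.65}  # over-represents 60+

weights = {g: population_share[g] / sample_share[g] for g in population_share}

def weighted_rate(responses):
    """responses: list of (group, holds_opinion) tuples."""
    num = sum(weights[g] * holds for g, holds in responses)
    den = sum(weights[g] for g, _ in responses)
    return num / den

# 60+ respondents break 60/40 for the opinion; under-60s break 50/50.
sample = ([("60+", 1)] * 210 + [("60+", 0)] * 140
          + [("under_60", 1)] * 325 + [("under_60", 0)] * 325)
raw = sum(h for _, h in sample) / len(sample)
print(f"raw: {raw:.3f}  weighted: {weighted_rate(sample):.3f}")  # 0.535 -> 0.523
```

Weighting corrects the demographics you can measure; it can’t correct the ones the pollster never asked about.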
The fact that the polling outfit is used primarily by Republican organizations is suspect, IMO.