Should be required watching for all those school systems that view music education as an optional extra:
I think this is an issue of neuroscience over-generalizing everything. The problem with studying the brain is that not all brains are the same. They vary among families, races, and so on, but there can't really be a "standard", which is what all "new" sciences struggle to establish. I find it hilarious when they misapply statistics. For example, the average location of a person on Earth is somewhere near the centre of the Earth. Also, they lump everything together as "musical instruments"; I'm pretty sure what goes on in a drummer's head is pretty far from what goes on in a singer's head.
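To make that "average location" point concrete, here's a toy calculation I just cooked up (purely illustrative, not from any study, and it pretends people are spread uniformly over the globe, which they aren't): average everyone's 3-D position and the "average person" ends up deep inside the planet, somewhere nobody actually is.

```python
# Toy illustration (invented, not from any study): the mean of positions
# scattered over a sphere's surface falls near the centre of the sphere,
# a point that describes nobody.
import numpy as np

rng = np.random.default_rng(1)
R_EARTH_KM = 6371.0

# roughly uniform points on the Earth's surface
v = rng.normal(size=(100_000, 3))
people = R_EARTH_KM * v / np.linalg.norm(v, axis=1, keepdims=True)

average_person = people.mean(axis=0)
print("distance of the 'average person' from Earth's centre (km):",
      round(float(np.linalg.norm(average_person)), 1))   # a few tens of km at most
```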
They often overstate "why" this is good without properly qualifying their statements for the proper target audience.
I do believe that playing music trains you to have a better memory (it's funny, because I can literally re-write a page of a piece that I glanced at two weeks ago, but I cannot remember my next-door neighbour's name, and I walk past her almost daily) and fine motor control. Creativity? Not unless you are playing seriously and have been stumped in places where you have to improvise. For example, while playing a piece for a friend I mixed up an up-bow and a down-bow; for the rest of the piece I used alternative bowing until a staccato passage, where I got back on track.
As for applying music learning to other activities: during my first intense exposure to calculus (third-year university level), I had to generate geometric shapes with math. This was very easy for me, because I could literally picture them in my head, and later on I was able to enhance this ability using classical music. In order to generate shapes in my head, I need to engage my mind constantly. Say I wanted to generate a funnel: think of an ascending legato scale. The signal is already there; it just made it much easier to generate shapes. Once I got into signal analysis and sampling, classical music, especially string music, became a lot more applicable, because you can represent almost any math equation with string music. The real question is whether I can perform it (most often, no).
Bottom line, I agree with the video to an extent, but it's a bad representation of the message they're trying to convey. I think this is because their target audience is non-musicians they want to bring into the game.
I agree with Scott. I don't think it's helpful to any cause, but especially to a progressive one, to base claims on "science" that is either unsound or not even science. It's awfully difficult to imagine a controlled study, with a sufficiently large and diverse population, to investigate the influence, if any, of musical education on intellectual development, but that's just too bad. Just because the experiment is difficult and expensive doesn't mean you get to fall back on pseudoscience, unfounded claims, and "publication" in the form of TED Talks that are intended for popular consumption and are not subject to the normal process of peer review. Scott clearly demonstrated just how easy it is to blast gaping holes through something like this.
After I assume power, the first thing I'll do is teach a course to high school freshmen called "why people believe weird sh!t", with Michael Shermer, the editor of Skeptic magazine, as guest lecturer. Part of our problem in this country is that people are willing to buy into anything that has the slightest ring of plausibility, whether it's Scientology, homeopathy, colonic cleansing, paleo dieting, the promise that flooding the country with guns will keep us all safer, the idea that processed food labelled "heart healthy" with a cutesy logo is good for you, selling Amway, or taking expensive, unproven supplements... you get the idea. That's why we see such interest in nutball presidential candidates: too many people are unable to distinguish good policy from bad, or facts from fiction.
Sorry about the rant. But anyway, music should be taught simply because it has its own intrinsic value. Teach people to play music, and think skeptically.
After you assume power... Phew, thank God I'm in Canada. I get so amused by "scientists" claiming things, much less by the people who buy into it. The best is when, at a conference, I meet "established scientists" making claims from false premises and get to see their faces go red when I ask them valid questions. I mean... apparently I'm an agent of Satan trying to open up a gate to hell. I'm working on detector upgrades for CERN.
As with anything, I don't think music is something to be thrust upon someone and encouraged with false promises. I think we should better expose it to the public and let people choose. For example, imagine if they offered free rental of fractional-size instruments for younger children.
I have to confess I'm taken aback at the level of vitriol being directed at this little film.
Everyone seems to be assuming a priori that all neuroscience in this field is worthless without even reviewing the papers, while apparently I'm a gullible "Scientologist nutball" for even considering that peer-reviewed research in major journals might have any validity at all.
If you want to engage with the evidence instead of ad hominem rants here are the references:
http://www.anitacollinsmusic.com/tedresearch/
And where did I propose thrusting music on people with false promises?
In the UK the percentage of kids learning an instrument has fallen 50% in a generation. School systems facing cuts are increasingly seeing music as expendable. Any well-founded evidence for the broader developmental benefits of music education should surely be welcomed by those embattled educators working to reverse this trend. I'm not proposing forcing anything on anyone, merely that schools should be offering the choice to families without the income to fund private lessons.
I was fortunate to have parents able to pay for music education and to attend a high school with one of the best music programmes in the UK. It has given me a lifetime of pleasure and at times some income too. I feel passionately that this opportunity should be available to all our kids, rich or poor.
An inspiring exception to the trend is the Shetlands, an isolated group of islands in the far north of Scotland with a strong fiddle tradition. Through the agency of Shetland Arts, every child on the islands has access to a free fiddle and free tuition. Obviously, there is no forcing involved. But a high percentage of kids take up the offer, and the result is a remarkable flowering of trad and classical fiddling in the community. You can even study for vocational and degree-level qualifications without leaving the islands, and many kids have gone on to make careers in music. This is what can be achieved by educators convinced that music should play a central role in a rounded education.
Interesting, two people so far have said they have trouble remembering names. That's something I do too! So that's three! I wonder if there's a pattern... ;)
Geoff,
Rants are my specialty.
But anyway, things like this can affect public policy. An example: a few years back, the governor of, I think, Georgia, declared that research showed that infants and unborn fetuses would be smarter and do better in school if parents listened to classical music. So public funds were used to distribute CDs. I think it was discredited later, but the fact remains that politicians and the public latch onto poor science, construct policies that sound reasonable, and spend a lot of money that could have benefited people in more tangible and immediate ways. If you really want to help working parents, don't give them CDs--subsidize daycare or mandate more paid parental leave.
Or... have real scientists represent scientific research, and disqualify "new" sciences. Otherwise you end up like Canada, which gutted the National Research Council (I've met quite a few valuable scientists in their forties "retiring") and turned it into more of an engineering firm. I just think any "scientific" research must present its data and be open to criticism by other scientists before any claim is made. That is both a strength and a weakness of CERN; plagiarism is a nasty thing.
I recall growing up in Korea and my parents buying into pseudoscience. They were even poisoning me with "medicine" to get rid of my asthma, which ended in pneumonia. And if we look back at North American history, people took radon baths for their "radioactive goodness", and so on. The point is that I think too many people abuse the word "scientific" nowadays. Some of it should even be actively discredited.
Ah well - everyone is still dismissing the research out of hand without actually reading it.
"I just think any 'scientific' research must present their data and allowed to be criticized by other scientists before making any claim"
Haven't you just described the peer-review process? The thing that strikes me about this research is the unusually high quality of the publications it appears in - mostly in the leading journal of its field. One of the papers was even published in Nature, the most prestigious and cited journal of all.
For some reason you're all convinced that the editors and reviewers in these trashy journals don't have the slightest idea what they are talking about, and that anyone who might give any weight to this research is a credulous rube.
But no-one has made a single substantive point to back up this rather odd position...
Frieda, I agree with that. I remember reading an article that presents measurements of the speed of light from the 1800s to the present and then claims that the speed of light has been decreasing, when the truth is that the instrumentation has been getting better, which means better accuracy. It's here:
http://phys.org/news/2014-09-curious-case-fluctuating.html
It made it into a TED talk as well. The difference is that this article actually presents data, although the data work against the author.
The last figure, figure 6, actually shows how the measured speed of light has been converging since the late 1900s. It's most amusing here:
"But maybe it just so happened that the speed of light fluctuated until about 1960 when laser interferometry was developed..." The author effectively demonstrates that what instrument was missing causing fluctuation in data.
As for articles on neuroscience and psychology research, I have yet to read one with proper data presentation. A scientific approach doesn't come in multiple-choice format.
Someone mentioned articles in Nature. There weren't any articles cited in the video. A brief bibliography would have been welcome.
If musicians are so smart then how come you have top level violinists honest-to-God believing that their tone depends on how many times the string is wound around the peg or that borrowing someone else's rosin once will ruin your bow?
I see institutions, not work or data.
The point of citing the literature is to use the data presented there to support a statement, which requires direct presentation of the supporting data. If an author cites only the conclusion of a paper, it's as valid as claiming "this is the case because he/she said so".
Which is totally valid in the artistic world: if my luthier tells me something about my violin, that's good enough for me.
When it comes to a scientific article, I can't conclude with E=mc^2 just because Einstein said so; I must either show the derivation of the equation or present experimental data that supports it. I'm just dissatisfied with how the above video presents "studies have been done to conclude this".
The author can use the conclusions of other articles to build premises, but cannot use a conclusion without showing what that article's premises are. They could have said "according to this study, a is true; according to that study, b is true; and using a and b, we can show that c is true." Instead the author went straight to "c is true, therefore you should do this; here are the references."
So what you're proposing is that every science and nature documentary should support each claim with scientific references right inside the broadcast?
No-one has ever done that in the history of broadcasting, and for one simple reason - no-one would ever watch it.
I'd be interested in evidence-based refutations of the claims made in the video. But instead we're getting wild general attacks on scientists and broadcasters. Where does that get us, exactly?
As for false evidence about the speed of light making it into a TED talk, I think you're talking about Graham Hancock's TEDx talk. TEDx is a mechanism for regional TED events with a lower level of editorial input. TED were so concerned about the inaccuracies in that talk that they controversially took it down. In general, TED has a good reputation for fact-checking.
Well Geoff, you make a good point. I agree that if a broadcast had to present the entire scientific basis for everything, it would get nowhere. Instead of watching an eight-minute presentation, we'd be watching eight years of documented research.
I just do not like the idea of broadcasting "the research has shown X"; I would much prefer "this study has so far suggested X", which is somewhat equivalent to this video's constant use of "may".
I often watch documentaries on evolution and cosmology. They often do not include any supporting data beyond pointing to "this fossil" or "this star". They do, however, build up to conclusions from beginning to end, even when the starting point is inconclusive and somewhat controversial.
I just find broadcasts like this one backwards: "music may allow this", so the conclusion is drawn first, and then a, b and c are offered as contributing to this "fact", and so on. I don't disagree that music would benefit a lot of people and help them in many different ways. I just think it's a bad way to present an idea or a body of studies.
Basically, "if I assume a, b and c, then d logically follows" is a good way to present an idea. If I say d is true, and then claim a, b and c are true because d is true, I'll probably be embarrassed, because I'd feel like this:
https://www.youtube.com/watch?v=_YgPBmfpLSM
Geoff wrote, "So what you're proposing is that every science and nature documentary should support each claim with scientific references right inside the broadcast?"
Well, yes. Depends what you mean by "broadcast." A video that's posted on the internet can have a link inside it that says "click here for a list of scientific articles on which this video is based."
Isn't that supposed to be one of the great things about the internet, that we can hyperlink all of this "information" that's "out there" on The Google?
Anyone who cheerfully sits through five seconds of an ad for car insurance will not be put off.
Okay, so it's pretty interesting. I looked at the articles that were linked on Anita Collins's web page. Truthfully I don't have time to read them all -- certainly not now, with my grades due to the Registrar in a couple of days -- so I scanned through the titles until I saw one that seemed to be making a claim relevant to the overall topic of the video.
The article I chose is entitled "A Little Goes a Long Way: How the Adult Brain Is Shaped by Musical Training in Childhood." The article is in "The Journal of Neuroscience," to which I have full-text access through Virginia Tech Libraries. This is a top-tier journal with an impact factor close to that of "Physical Review Letters." They're not going to publish junk.
Here are a few observations after spending about 10 minutes with this article:
1. The title uses the term "Adult Brain" but the authors measured response in the brainstem to audible frequencies being played through headphones, arriving essentially at a signal-to-noise type figure of merit. Yes, I realize the brainstem is part of the brain.
2. The size of the sample is 45 people, divided into three groups having zero, a few years, or several years of childhood musical training.
3. No distinction is made between private lessons and school-class type training. (About a third of the subjects studied piano, which I don't think is offered in many public school classroom environments.) My concern there is that private lessons may be associated with family environments in which there is more music being played generally, on the radio, by siblings, etc. So I wish they had controlled for that, but they'd probably need a much larger sample size then.
4. The study suggests that even a year of "musical training" is significantly better than none in terms of the response of your brainstem to audible signals. There isn't any correlation between the number of years of lessons and the response. One year is as good as 11 (Figure 1C). One wonders therefore whether two months would be sufficient, or two weeks. (If you can play Twinkle, is your brainstem good to go?)
5. Even though the authors seem to be wanting to make the case that a year or so of musical education in school has a lasting effect, the best correlation (r=0.41, fairly pathetic as correlations go, but apparently acceptable to the reviewers) showed that the effect slowly wears off once you stop "musical training."
6. A citing article (one of 9) was entitled "Listening to the Brainstem: Musicianship Enhances Intelligibility of Subcortical Representations for Speech" (Journal of Neuroscience, 2015, 35, 1687). This article reported a study in which "brainstem responses were recorded in response to a five-step synthetic vowel continuum" using fourteen (14) subjects, all students at the University of Memphis.
Please understand that I am not attempting to discredit these findings. I'm only trying to underscore the **extremely** incremental nature of neuroscience research and the limited (tiny) scope of the studies cited. Above all I want to point out the grave danger of accepting a very broad-sounding article title at its face value ("shaping the brain" sounds a lot more dramatic than what was actually measured) without asking the most important question in science: "How do they know?"
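As a back-of-envelope follow-up to point 5 above (my own arithmetic, not the paper's, and it assumes the r = 0.41 was computed over the full 45-subject sample, which may not be the case): r = 0.41 corresponds to only about 17% of variance explained, which with that sample size is statistically significant but nothing like a strong effect.

```python
# Rough check of what r = 0.41 with n = 45 actually buys you
# (my arithmetic, not the paper's).
import math

r, n = 0.41, 45
r_squared = r ** 2                                   # fraction of variance explained
t = r * math.sqrt(n - 2) / math.sqrt(1 - r_squared)  # t statistic with df = n - 2

print(f"variance explained: {r_squared:.0%}")        # ~17%
print(f"t statistic (df = {n - 2}): {t:.2f}")        # ~2.9: significant, but modest
```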
"Why Most Published Research Findings are False" are we really getting our panties in a bunch about a TED talk? Here's a popular article about the issue and a popular source about the subject of the article, Dr. John Ioannidis.
Compared to this issue... if policy has nothing to do with scientific evidence and everything to do with lobbying, what's wrong with garnering popular support for our field? What's the alternative? And where is the grave danger? If a mirror can cure phantom-limb syndrome, does the patient care about the incremental nature of the supporting research? Apart from feeling a little sheepish, where's the grave danger in sitting in front of a mirror box, staring at a reflection of your remaining arm? If parents suddenly give their children private music lessons because of suggestions that it will shape their children's brains, does that present a grave danger to the well-being of the child or the family? As far as scientific discovery and research go, I'd have to say knowledge from neuroscience has got to be the furthest thing from dangerous, much less gravely so. Compare it to the potential danger of other scientific endeavours... any of them... pick one... how about physical chemistry, or nutrition, or medicine?
Well Paul, I have much to say, but I will summarize it by thanking you for doing the actual research. The fields of neuroscience and psychology have almost always overstated findings drawn from small pools of data, abusing words such as "random" or "scientific". Again, this is an issue of their being unable to find a proper standard to build upon.
They cannot have a "standard" or "average" mind/brain/person, etc., because there isn't one. They can, however, say "the average person in Canada", etc. And they work from a tiny pool: 45 people, is that a joke?! Physicists waited over two years to confirm the Higgs particle and made sure the result cleared a better-than-99.9993% confidence threshold. 45 people is not even a millionth of a percent of the world population!
I admit that I kind of got a lump in my stomach when I saw the data through which the regression lines were drawn.
It's just as repulsive to abuse scientific findings in support of a progressive cause. The "grave danger" is that science becomes, in the minds of the general public, a false basis. Are we there yet?
Consider the parents deciding between music lessons and figure skating because they can only afford one for their child. Is there no harm if they choose the former for false reasons?
Jeewon wrote, "Knowledge from neuroscience has got to be the furthest thing from dangerous." I'm not so sure. Haven't we been pretty close to the brink with studies of intelligence as a function of race?
"Are we there yet?"
I think we're far past that point. And I don't think science is innocent in the matter. Did pure research ever exist? Is science ever not used and abused for a cause? Can we so easily forget climate-change-denying scientists? What about DDT, fire retardants, xenoestrogens? To be sure, Edward Bernays and the field he invented probably had a lot to do with the way information is now disseminated to the public and how the public responds. But what pure scientist develops napalm for its "ability to penetrate deeply... into the musculature, where it would continue to burn day after day"? To me it seems naive to think science is ever pure finding, with no bias or agenda behind its presentation. I don't believe science is a false basis, but can the public truly trust any scientist, even with all their seemingly objective data (which the vast majority of people have no idea how to interpret), at face value? Who can we turn to? I'm really asking.
But what I'm really trying to point out is that there is danger and then there is danger. Drought, famine, war. That to me is danger. Music lessons v. figure skating lessons? C'mon. Talk about first world, and increasingly, upper 1% problems. So you can rant all you want about truth and be all indignant about TED talks and popular scientific books, but show me a real solution to the increasing knowledge gap between the public and experts and I'll listen. I think the onus is on scientists.
To jump from neuroscience to race and intelligence you'd have to have a scientist make that bigoted leap wouldn't you? So yeah I guess science is really not dangerous, only scientists.
Not scientists: politicians. Science is about gaining knowledge and information; it's up to people to choose what to do with what's gained.
I find it amusing how limited a statistics background some disciplines work with. Linear regression applies to linear data only. If the data are time-dependent or Poisson-distributed, and especially when we're dealing with people and brains, we must take many different variables into account, which means multivariate statistics. As soon as I read an article that presents data involving more than one variable with a simple linear regression, I toss it in the recycle bin.
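To show what I mean about needing more than one variable, here's a toy example I just cooked up (all names and numbers invented, nothing to do with the actual studies): when the real driver is a confounder you left out, a single-variable linear regression happily assigns its effect to the wrong predictor, while a fit that includes both variables does not.

```python
# Toy illustration of omitted-variable bias (all names and numbers invented).
import numpy as np

rng = np.random.default_rng(0)
n = 500

years_of_music = rng.uniform(0, 10, n)                             # hypothetical predictor
parental_income = 30 + 5 * years_of_music + rng.normal(0, 10, n)   # confounded with it
test_score = 50 + 0.5 * parental_income + rng.normal(0, 5, n)      # driven by income only

# Single-variable regression: test_score ~ years_of_music
X1 = np.column_stack([np.ones(n), years_of_music])
b1, *_ = np.linalg.lstsq(X1, test_score, rcond=None)
print("music slope, single-variable fit:", round(b1[1], 2))        # spuriously large (~2.5)

# Two-variable regression: test_score ~ years_of_music + parental_income
X2 = np.column_stack([np.ones(n), years_of_music, parental_income])
b2, *_ = np.linalg.lstsq(X2, test_score, rcond=None)
print("music slope, two-variable fit:", round(b2[1], 2))           # close to zero
print("income slope, two-variable fit:", round(b2[2], 2))          # close to 0.5
```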
Neuroscience is often not dangerous because of the research performed; it's dangerous because of how prematurely its conclusions are drawn. That's what distinguishes science from "science". Scientists study a small concept for over a decade before a conclusion can be granted. Unfortunately, as human knowledge progresses, some scientists will not see the conclusion of their research in their lifetime (cosmologists for sure). The reason scientific communities form HUGE collaborations is that each individual works on a teeny-tiny part of the overall research effort, which eventually leads to a breakthrough. Once some conclusions are drawn, someone else can build upon that research and do the same thing all over again.
What I find in neuroscience is that they draw quick conclusions about something that is supposed to take a very long time. Here's the thing: if I were to state that "this is going to happen", the model must be able to support itself as it is applied to other subjects. These days we can certainly simulate it using computers, as well as select random pools to experiment with, and so on.
I see the dangers of neuroscience and psychology at work in society in the medications and treatments for psychological illness. The medications are treatments, not cures. Rather than having the problem fixed, patients grow dependent on the medications. The medications have many different side effects, which also differ in intensity and kind from person to person, because each person is different. Each person is then supposed to find the "right medication" that works for them.
This is a sign of incomplete research. If something cannot be generalized and applied to everything, the model is flawed, or must be localized: "this medication works for person A", "this medication works for person B", rather than "this medication works".
Returning to the original topic: playing music helps people neurologically. Yes, I agree. How and why? Performing music activates many different parts of the brain. Agreed, but so does solving a complex maze. Does it work the same way for every person? No!
Liz, two linear regressions can and should be combined into a 3D plot. Not to mention, nature is almost never linear.
Jeewon the question of musical education as we know it is a first world problem at the outset. I do see your point about the trustworthiness of science. As for me, I prefer the high road. I do suffer some consequences thereby, but why bore you with another first world problem like research funding.
It currently seems to be fashionable to dismiss "first world problems."
We do live in the first world, right?
Besides, it's all relative...
Speaking of fashionable, it seems everyone is wearing giant glasses a la 1978. What's the deal with that?
Scott: Yes. To the parent making the decision between violin lessons and figure skating, at that time and in that circumstance, the decision is important. To them. Each activity proffers tangible and intangible benefits including many (even most or all) that have never been (and will never be) proved beyond reasonable doubt. Then along comes someone telling you that musical education "shapes your brain" and suddenly you find yourself wondering if you should have your child vaccinated. Oh sorry ... what were we talking about again?
There is a distinction between TED and TED-ed. TED-ed videos are scripted by someone (usually a teacher) who wants to "teach" a particular topic. It's like a parent trying to teach his/her child the benefits of broccoli. Sometimes exaggeration is involved.
You guys do know it's a figure of speech, yeah? It doesn't actually mean the first world has no problems, as is painfully obvious to all, problems with dire consequences for us and the rest of the planet. But those real problems are not the ones referenced by the figure of speech. Yeah?
I like TED talks because they give me hope that there are some creative, smart, unjaded people left in the world with the drive and passion to possibly make a difference for our future. I don't look to TED talks, or the news, or any single source in making important decisions, and it's a bit condescending to suggest parents would make important decisions for their children based on one TED talk, or on the result of some scientific study.
There are studies that link academic success with participation in sport. There's a study for every agenda under the sun, isn't there, and so many conflicting results that I'm pretty sure parents don't look to scientists in their decision-making, not that I really know.
But we do need experts with some integrity, who can wade through the mass of conflicting 'truths' out there and help make sense of it for the lay people, to help parents make informed decisions, to influence policy. And how will they reach the public? Ask Bernays.
I enjoy TED talks too, for the same reasons you do. They're inspiring.
Greetings,
I am wondering if the enjoyable ferocity of this debate is fuelled by all our bloated brain stems?
Since everyone involved is making much higher-order statements than your average Joe Bloggs who sits in front of the TV with a beer every night, don't you constitute a reasonable demonstration that the TED premise is true?
Cheers,
Buri
And the viola must be better than the violin. Every time I play my new viola I can feel it in my brain stem ...
"...but why bore you with another first world problem like research funding."
Actually Paul, I don't think that is a 'first world problem' at all, but one of grave importance and global impact (over the long run.) I've heard things here and there from various scientists, but according to Ioannidis, "an obsession with winning funding has gone a long way toward weakening the reliability of medical research."
More from the article:
At the time, he was interested in diagnosing rare diseases, for which a lack of case data can leave doctors with little to go on other than intuition and rules of thumb. But he noticed that doctors seemed to proceed in much the same manner even when it came to cancer, heart disease, and other common ailments. Where were the hard data that would back up their treatment decisions? There was plenty of published research, but much of it was remarkably unscientific, based largely on observations of a small number of cases.
Ioannidis was shocked at the range and reach of the reversals he was seeing in everyday medical research. “Randomized controlled trials,” which compare how one group responds to a treatment against how an identical group fares without the treatment, had long been considered nearly unshakable evidence, but they, too, ended up being wrong some of the time. “I realized even our gold-standard research had a lot of problems,” he says. Baffled, he started looking for the specific ways in which studies were going wrong. And before long he discovered that the range of errors being committed was astonishing: from what questions researchers posed, to how they set up the studies, to which patients they recruited for the studies, to which measurements they took, to how they analyzed the data, to how they presented their results, to how particular studies came to be published in medical journals.
This array suggested a bigger, underlying dysfunction, and Ioannidis thought he knew what it was. “The studies were biased,” he says. “Sometimes they were overtly biased. Sometimes it was difficult to see the bias, but it was there.” Researchers headed into their studies wanting certain results—and, lo and behold, they were getting them. We think of the scientific process as being objective, rigorous, and even ruthless in separating out what is true from what we merely wish to be true, but in fact it’s easy to manipulate results, even unintentionally or unconsciously. “At every step in the process, there is room to distort results, a way to make a stronger claim or to select what is going to be concluded,” says Ioannidis. “There is an intellectual conflict of interest that pressures researchers to find whatever it is that is most likely to get them funded.”
It seems funding is rather at the crux of the problem. And as Steven mentioned, our previous government crippled independent research here in Canada severely, mostly to muzzle evidence based findings on environmental issues (Harper's war on science....) So I'd be very interested in hearing about funding woes in your field.
Jeewon, don't get me started. It's WAYYY off topic and I'd rather not get into it in public.
I wouldn't want you to jeopardize your own career, but not even a glimpse? I guess this is not the proper forum, and your reluctance underscores just how dependent science is on non-scientific factors. So with respect to overreach, bias, falsity, faking it: the problem seems to be widespread and systemic, subject to all the deficiencies of free-market capitalism and human fallibility, despite the scientific method and peer review, and not endemic to any one field. So doesn't that bring us back to 'market the hell out of your agenda until someone comes along to contradict your research, after which point it will continue to be believed and cited for a decade or more anyway?'
Medical research is not especially plagued with wrongness. Other meta-research experts have confirmed that similar issues distort research in all fields of science, from physics to economics (where the highly regarded economists J. Bradford DeLong and Kevin Lang once showed how a remarkably consistent paucity of strong evidence in published economics studies made it unlikely that any of them were right). And needless to say, things only get worse when it comes to the pop expertise that endlessly spews at us from diet, relationship, investment, and parenting gurus and pundits. But we expect more of scientists, and especially of medical scientists, given that we believe we are staking our lives on their results. The public hardly recognizes how bad a bet this is.
...
Though scientists and science journalists are constantly talking up the value of the peer-review process, researchers admit among themselves that biased, erroneous, and even blatantly fraudulent studies easily slip through it.
...
Perhaps worse, Ioannidis found that even when a research error is outed, it typically persists for years or even decades. He looked at three prominent health studies from the 1980s and 1990s that were each later soundly refuted, and discovered that researchers continued to cite the original results as correct more often than as flawed—in one case for at least 12 years after the results were discredited.
I think I'll go back and watch some Star Trek reruns...
Star Trek reruns? Cool.
I've always wanted a student called Kirk. Then if they had a weak tone I could shout 'Kirk to bridge. Kirk to bridge.'
Cheers
Buri
I'm Klinging on to it.....
Qapla' Buri!
Jenny, I guess as with all First World Problems, it comes back to profit over knowledge, beauty, truth and justice. Now I'm getting really depressed. 'Tis the season. Anyone seen Force Awakens yet? No spoilers!
To humour you, Jeewon: I'm using an X-ray source from 1992. We gave up on using an oscilloscope from the 1970s, and our very capable technicians put together old parts to build a power supply for the X-ray source.
Meanwhile, the Russians paid $15,000 to a high-tech company to build a machine that is far superior to what I'm working with.
Seven months ago, we couldn't even afford a $300 software licence upgrade. Thankfully, we've gotten much better funding this year (and also a huge jump in public interest in our research; we even had a museum representative come in the other day to make an exhibition out of our labs).
Is funding hindering research? Oh yes. Does that mean we should, or would, do a half-finished job? No, which is why my colleagues and I are in relatively high demand (since we couldn't throw money at some issues, we used our heads, and employers/supervisors like that). The analogy I used to give is that I felt like my supervisor handed me a spoon to dig a hole to China and gave me a week to accomplish it. I also keep the undergraduate society within reach so I can pick out volunteers; they gain experience and something for their CV, and I get extra hands.
Do some other institutes do a half-finished job? Yes, which is why we have so many different ways and levels of testing, and why huge collaborations, in physics at least, often take a long time to publish even small, valid scientific articles. Then again, with a huge collaboration, even if each person publishes only once every ten years, with just 3,653 researchers we'd still see daily publications.
Other disciplines often have much smaller collaborations and publish prematurely. This goes back to public attention and funding; it's a vicious cycle. I see people who knowingly publish what they aren't sure of as selling their souls, and I would rather not associate with them. I think it's safe to say that I personally choose to keep my distance from neuroscience and psychology.
Jenny, I don't know. I wouldn't be very surprised if that were true though.
"I personally choose to keep my distance from neuroscience and psychology."
And, by your reasoning, medicine, right?
I don't pretend to know anything about how research funding works, which is why I ask, but I was under the impression Ioannidis was talking about funding for new research. Is that where your operational budget comes from, for capital equipment too?
I can't imagine physics research changes as frequently as the life sciences, or with the same kind of urgency. I'm not suggesting that's any reason for doing shoddy work, but given the ethical implications and complexity of working with living systems, and the pressure to produce results, I can understand why there's more error in certain fields. It seems, as with the financial industry, there needs to be more accountability than there currently is, with more or better regulation. But how many meta-researchers like Ioannidis are there, and what clout do they have? I don't know; it's exasperating!
As for the video above, I still think it's relatively benign compared to something like Prozac, or stents, or even the food pyramid. Yes, it's belief based on secondary sources, but that's all most of us have. They say the research is still new, but it's interesting if overreaching, shows some plausibility, and warrants further investigation to answer Scott's questions. (I admit I only just watched it, though I might have seen it before a long while ago.)
Geoff's original assertion was that the video should be of interest to school systems.
However, the population that takes strings in school is different from the population in serious study:
Students in string classes typically neither practice, especially over vacations and summer, nor take private lessons. So it's hard to conclude that this superficial level of study would lead to significant brain changes.
Then there are the serious students who take weekly lessons, attend concerts, practice during the summer, etc. Here's the problem with teasing out cause and effect: this group of serious students is usually from the socio-economic elite. Their parents are upper-middle-class professionals who usually have graduate degrees, and they can afford good instruments, lessons, and summer camps. Anyone with private teaching experience knows these kids, who are generally elite for many reasons: they get good grades, stay out of trouble, spend their summers productively, and go on to college, often elite ones. If they quit the violin, they often end up in medical school or some such. Remember that this group, the academic elite, didn't just start their music studies early: they also went to Montessori preschool, their parents were likely virulently anti-TV, and they take calculus over the summer.
In some cultures it's taken to extremes, with students pushed mercilessly to excel at music, often with the sole intention of using it to gain college admission. These students are pushed mercilessly at everything except sports.
So to summarize: it's hard to imagine public school strings having that much impact on the brain. But on the other hand, certain types of kids, usually studious, and at the highest levels already the product of huge socio-economic benefits, are drawn to play strings. They have self-selected.
How can one separate the effects of music on accomplished musicians when they are almost always the products of elite parenting from the womb?
Yes, I think music should be studied:
1. for its own rewards, and
2. because it's an outlet that many kids thrive on at school. Most activities at school are sports-related, and not every kid is into team sports.
I'm with you Liz.
Ideally school systems would fund instrumental teaching for its own sake, as they do in Shetland (see my post above).
But pragmatically, anything that demonstrates a wider developmental benefit must help those fighting to secure music in the curriculum.
I don't accept that public programmes need necessarily be superficial. Here's a success story reported by Laurie herself:
http://www.violinist.com/blog/laurie/20126/13624/
As I know from my own experience as a kid, with imaginative teaching music can become embedded in the very fabric of a school...
Liz nailed it.
December 12, 2015 at 04:34 PM · There have been many attempts over the years to directly tie intelligence and the likelihood of success, both academic and professional, to music.
I'm not convinced, and this video didn't help. It is too difficult to tease out cause and effect, and the video makes a number of unsupportable claims, sometimes qualified, sometimes not:
2'50": "...disciplined, structured practice...allowing us to apply those strengths to other activities..."
What other activities, besides other instruments?
2'50"": "...may allow musicians to to solve problems more effectively and creatively in both academic and social settings..."
A breathtakingly broad statement. At least they prefaced it with "may."
3'00": "... Musicians often have higher levels of executive function."
3'20": "...musicians exhibit enhanced memory functions."
(I wish--I can't remember names to save my life...)
Here's the issue: how does one separate cause and effect? Does music make you smarter, or does attempting to study music quickly filter out those lacking fine motor control, discipline, and memory to begin with? I've had students for years whose memory and creativity were static. And the ones who do succeed in other areas, such as academics, have disciplined and engaged parents and attend good schools with smart peers.
The arguments for music education seem to resemble those in favor of playing video games: that skills somehow transfer. In the case of video games, playing probably helps you get good at that particular game, but the argument that those skills somehow transfer is dubious.
Music study is much more complicated, especially since, except for twin studies, you can't go back and have someone not study music and see what happens over the course of their life.
Don't get me wrong: I strongly believe in early music education. I just think we need to take grandiose claims of its benefits with a grain of salt--music should be done for its own sake.