In the big faking game to ensure first-week collections, a reviewer is only a small culprit. Among the big names, only one reviewer appears disposed to hand out benevolent ratings to producers to drum up business, which is obviously unfair to readers. The rest are doing a fair job.
As in every profession, movie reviewers come in all shades. There are trade analysts willing to offer glowing reviews to help drive box office (BO) collections, and there are others courageous enough to stick their necks out and call a flop a flop despite the producer's clout.
To be honest, a reviewer is only a chhota (small) culprit in the big faking game to ensure first-week collections. Stars who promote duds (movies that simply fail to take off with both viewers and reviewers) are bigger culprits. Interestingly, among the big names, a dispassionate look at the numbers throws up only one reviewer as being disposed to hand out benevolent ratings to producers to drum up business, while being unfair to readers. As we said earlier, the first week's collection, especially the all-important first weekend, is often influenced by the marketing blitz and star promotions; reviewers really influence only the second week. Since reviewers are, after all, human, we do understand occasional aberrations, mistakes or differences in perception. However, we have also come across a pattern of consistent, undeserved high ratings by one reviewer that stands out.
We considered an exhaustive list of 38 analysts and all available ratings by them for the movies included in the study (refer graph: The dozen Pied Pipers of Hindi Cinema). Those with fewer than 10 reviews are excluded from the detailed analysis of the reviewers themselves, but the ratings given by them are retained for the movie analysis. It must be recognized that any positive or negative outcomes are purely data-driven and unintended. (Only Hindi movies released in the first half of the year, between January and June 2012, and surviving at least a week at the BO are considered. There is a whole alternative revenue stream of music rights, satellite rights, merchandising agreements and in-film advertising that is not the subject of this analysis.)
It flows from the analysis and the chart above that not all analysts do justice to the ratings. A good analyst must exploit the full range from 0 to 5 to rate movies, especially when viewers reject movies so frequently. If a reviewer hands out only 3- and 4-star ratings for every movie reviewed, you are not getting a worthwhile opinion. A reviewer can be purely value-based and unconcerned with BO collections, or can be purely a trade-based critic; but the review must be qualified by a disclosure. (Reviewers whose ratings were included in the research are: Rajeev Masand, Taran Adarsh, Shubhra Gupta, Anupama Chopra, Blessy Chettiar, Omar Qureshi, Martin D'souza, Komal Nahata, Raja Sen, Madhureeta Mukherjee, Kunal Guha, Avijit Ghosh, Srijana Mitra Das, Saibal Chatterjee, Sukanya Verma, Subhash K Jha, Aniruddha Guha, Khalid Mohammed, Piyali Dasgupta, Mayank Shekhar, Kanika Sikka, Vivek Bhatia, Prashant NDTV, Shomini Sen, Mrigank Dhaniwala, Preeti Arora, Zinnia Ray Chaudhary, Resham Sengar, Priyanka Ketkar, Puja Banta, Nikhat Kazmi, Soumyadipta Banerjee, Shakti Salgaocar, Aseem Chhabra, Shaikh Ayaz, Ritu V Singh, Anirudhha Guha, Gaurav Malani)
A head-to-head comparison of the top two analysts of the film world, for all the movies they rated in the period of study, makes for interesting reading. Note that, barring two movies out of 33, Taran Adarsh always rates a movie higher than Rajeev Masand. Rajeev Masand has an average rating of 2.2 and the third horse in the race, Anupama Chopra, has an average of 2.3, as against Taran, whose average is way higher at 3. For the statistically inclined, there is a significant difference on all parameters. Take away the exception, Bittoo Boss, and you know the rule. It is cognitively difficult for readers to discount a rating by linking it to the name behind it; a high rating is a high rating, and it fools us into an inflated opinion of a movie's worth.
The difference is more pronounced when it comes to big-budget movies. For the 10 costliest movies in the period, the average rating by Rajeev Masand is 2.15, while that by Taran Adarsh is a whopping 3.45, on the same scale of 5. That Taran's ratings are influenced by a movie's budget, more than by popular choice, is too obvious to require further explanation. It is no coincidence either that he is a trade analyst, and the bias shows. The continuation or drop of business in week 2, and thereby the public verdict, largely supports Rajeev's and Anupama's review ratings, as also those of most other analysts. Even last year, the costliest movie ever made, RA One, was given the highest rating of 4.5 by Taran, when most analysts could not even tolerate the movie till the interval. On the other hand, after the same movie, Anupama Chopra wrote a very honest article, Box Office Followers, blasting the obsession with box-office moneybags. To quote her, "The multi-crore grosses that eventually follow are then bandied about as proof of quality. The art becomes irrelevant."
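For readers who want to try this kind of comparison on their own data, here is a minimal sketch in Python of how per-reviewer averages and a head-to-head count could be computed. The reviewer names and ratings in it are placeholders, not the data used in this study, and this is not the author's actual method.

# Minimal sketch of the comparison described above; placeholder data, NOT the study's data.
# Ratings are keyed as movie -> {reviewer: stars out of 5}.
ratings = {
    "Movie A": {"Reviewer X": 3.5, "Reviewer Y": 2.0},
    "Movie B": {"Reviewer X": 4.0, "Reviewer Y": 2.5},
    "Movie C": {"Reviewer X": 3.0, "Reviewer Y": 3.0},
}

def average_rating(reviewer):
    # Mean of all ratings given by one reviewer across the movies he/she rated.
    scores = [stars[reviewer] for stars in ratings.values() if reviewer in stars]
    return sum(scores) / len(scores)

def head_to_head(a, b):
    # Of the movies rated by both, count how often reviewer a scored strictly higher than b.
    common = [m for m, stars in ratings.items() if a in stars and b in stars]
    wins = sum(1 for m in common if ratings[m][a] > ratings[m][b])
    return wins, len(common)

for reviewer in ("Reviewer X", "Reviewer Y"):
    print(f"{reviewer}: average {average_rating(reviewer):.2f} out of 5")

wins, total = head_to_head("Reviewer X", "Reviewer Y")
print(f"Reviewer X rated higher than Reviewer Y in {wins} of {total} movies")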
The Fakestars: the film industry helps create them because of what rides on them, not because of the intrinsic merit of the movie, and reviewers can help build them up with an extra star. Check our film reviewer rating comparator for all movies in the Fakestar category for the top three reviewers (see graph: Aggressive promotion, inflated ratings and reality strikes in week 2). For seven out of the eight movies, Taran Adarsh has more generous ratings than the other two. The eight movies listed below accounted for 50% of the production cost of the 33 movies the industry made in the period under study.
Names like Agent Vinod (Rs60 crore), Agneepath (Rs55 crore), Rowdy Rathore (Rs50 crore), Players and Housefull 2 (Rs45 crore each), Ek Main Aur Ek Tu (Rs35 crore) and Teri Meri Kahaani (Rs30 crore) are mostly movies that were either rejected outright by the public or were mediocre. Such big-budget movies, which are no Kahaani or Vicky Donor, need help from reviewers to prop them up, and some seemingly oblige with extra stars.
I learnt from Satyamev Jayate that cynicism alone does not help; provide positive case studies, poster-boys and solutions, rather than just criticise. So, picture abhi baaki hai (the picture is not over yet)… we will return, break ke baad (after the break), with ideas for positive change. In the meantime, bouquets and brickbats are welcome.
Some material disclosures: The author has no business interest in movies, production houses or media companies. Box office data was collected from trade journals. While all available known sources of online ratings by analysts were included, some analysts may have reviews that are not included; such corrections, though not material for large movies, can be made if brought to my notice. The subjective views of analysts are considered to be in harmony with, and condensed in, the overall rating. US and UK box office data is not included; a separate analysis has been done and is available, with some very interesting insights, but it does not interfere much with the analysis here.
(Sandeep Khurana is the founder and principal consultant of QuantLeap Consulting Services, based at Hyderabad. An ex-army officer, he is well-read and experienced in the government and corporate sectors. Sandeep holds a management degree from the Indian School of Business. He has an interest in social media, analytics and operations, and likes to watch all good movies. He can be reached at [email protected] or on Twitter at @IQnEQ.)
The award of ratings started as modern marketing evolved over the last decade. Earlier, there was no 'star rating system' for movie reviews. Now this PR exercise has become a 'racket' for spending black money on such 'marketing'.
The inside story is that such ratings are awarded not by the reviewers who actually wrote the review, but by the marketing or edit desk of the media house, which has the privilege to alter or edit any text. This is influenced by the 'money power' a PR set-up (indirectly, the producer) can spend. If one reads the actual review, it is almost always inconsistent with the rating: the film is blasted inside the review, but still gets a 3 or 3.5 rating.
It is high time that the media wields some responsibility and abolishes this influenced rating system for film reviews.
Mohan Siroya
(Freelance film critic for the last 50 years)
Thanks for your detailed feedback and for your affirmation, as an industry expert on the subject, of the malpractices that our analysis also supports objectively.
The inconsistency of ratings with the subjective comments can also be scientifically analysed to prove the manner in which it is being done, as you state. We were quite keen to include the PR (marketing) budget vs the production budget in the analysis, but were hampered by unreliable or non-existent data in most cases.
Regarding names etc in the ratings, I stand by the data and my honest disclosure of not being associated in any manner with any film entities, except as a fan of a few random stars that change often. The material disclosures state (to re-quote): "While all available known sources of ratings by analysts online were included, some analysts may have reviews which are not included. Such corrections, though not material for large movies, can be made if brought to my notice." Srijana Mitra Das is part of the 38 analysts included in the study but has fewer than 9 reviews available during the period, while all the others featured in the graph have more. With part-time reviewers like SM Das, the sample size of their reviews is too low to draw meaningful inferences. The top four reviewed two-thirds or more of the movies released during the period, each three times or more what SM Das did, just as a matter of comparison. Moreover, readership is limited if they are part-time. There is no other reason to exclude SM Das; the reasons are purely data-driven.
Your observation on five names supports the premise of the article that many do it. The reason only one is singled out for comparison is to draw more inferences from larger samples and, as elaborated in the further analysis where movie budgets are an influencing factor, because this one analyst's median and gap from the other analysts are jacked up much higher. If others look better or worse than him through the data, so be it; I agree there. Being an industry veteran, you would know the extent of influence both ways, between movie critics and the beneficiaries of their reviews, for each name, to support such an inclusion.
We could discuss some of the constructive suggestions you make to remedy the malaise.
By the way, readers are not concerned whether a reviewer is a staffer or a freelancer. But the 'inner corruption' works for both: if a staffer is involved, he too gets his share.
I have already suggested a way out to control this malaise to some extent: abolish the 'star rating system'. Readers will not get swayed by a mere rating, but will go through the contents to judge the merit.
Since we cannot expect 'principled' film critics like the late Bikram Singh or S Banaji in this polluted system of media control, cinegoers will be the best judges of whether to rely on such reviews or reviewers. Even if some critic wishes that the 'truth' should prevail, irrespective of the consequences, no media house will dare to publish it.
As far as the 'compromising reviewers' are concerned, can we really blame them? Do they not also have to survive in this 'corrupt system'?
To come out of such a predicament as an honest critic, I had to start my own journal, "Shadow Play", in print, to write what I felt was just and right, without favour or fear. Now the same has been converted into a web portal, shadowplayindia.com. If you care to go through my psycho-analytical critical reviews of a few top films (including the Oscar winner 'Slumdog..'), you may get a feel of it.
All the best and regards
Mohan Siroya
Also, you are probably an exception. Most journalists in all disciplines (business, politics, films, auto, etc) compromise for access. Yes, you don't need access to review and write critical stuff (one can say the same about cars), but you do know that journalists still want 'access', right? This is especially true in politics and business.
Yes. If you can share data for Hindi/regional-language reviewers, we can include that too. But it would not make the English-language reviewers any better or worse, as they are benchmarked against public opinion rather than just against one another.
2. It is an eye-opener that most of the movies that are supposed to be big hits run for only a week. Take the case of Bol Bachchan: supposedly a Rs100-crore movie, but it ran for only a week in most places. Similarly, Rowdy Rathore may have just covered its cost in the first week. No one knows the details.
3. Years ago, Bikram Singh ripped apart a movie made by a well-known film-maker. The movie became a golden jubilee hit. Can you guess which movie it was? It was "Bobby".
4. For long, it has been understood and agreed that critics' comments about a movie are diametrically opposite to public opinion. But now that is changing.
5. I completely agree with Sandeep that most reviews I read are extremely positive. Can you believe that even something like Kya Kool Hai Hum got a wonderful rating? So, the truth is that there are no real reviewers, only paid moonlighters who are part of a film's PR strategy.
6. Look at the way film stars shamelessly appear on TV to promote their films. Aamir Khan and Sunny Deol were known to use the media like a curry leaf: using them when needed and discarding them later on.
7. Lastly, how many middle-class families actually venture out to a multiplex and spend Rs2,000 for four persons to watch a movie? In any case, most movies are telecast on TV within three months of their release. I feel that no one watches a movie because the review was good, and no one abstains from a movie because the review was bad. At the end of the day, all opinions expressed by reviewers are subjective.
An incident from Tamil Nadu: Ananda Vikatan, a popular Tamil magazine, had criticised a movie made by K Balachander in the 80s. They used to give marks (60/100, 70/100, etc), but the reviews were awesome. K Balachander was so incensed by the review, and his ego so hurt, that the matter turned into a big thing, and a hurt Ananda Vikatan decided to stop reviewing films for the next two years. Ananda Vikatan's editor and K Balachander had been at loggerheads way back in the late 60s, and this incident reopened the earlier wounds. Just sharing a bit of nostalgia.