
Why are most reviews clumped together in timing to specific days?

I'm always a little suspicious when certain programs turn bad reviews around with a flurry of glowing (exclusively 5-star) reviews that drop in batches.

In theory, there could be reasons for this (maybe friends are all finishing programs and decide to drop reviews together?) but I find it somewhat unlikely. The option listed in the poll is my best explanation, but I can't discount the possibility that career/admissions offices want to paint a glowing picture of programs.

@Andy Nguyen, I assume the profiles are vetted to make sure actual students are placing these reviews. Do you have any doubts as to the validity of some reviews?

Edit: I messed up and the poll disappeared. The options were:
A: The career/admissions offices hand-pick students and tell them to write a review
B: Students see a review of their program and want to add their own thoughts
 
This was brought about by the three recent CMU responses; while the program is very admirable, the timing of reviews (not just in this program specifically; look at UChicago, which got half a dozen within a few days) has always felt weird to me.

There is a third option, actually the most likely now that I've thought of it: @Andy Nguyen receives these separately, along with proof of graduation or something that acts as verification, and has to manually post them to each program. This would be much like the reviews of the C++ course, which he posts and then reassigns to that person's account. The delay between receiving and posting would explain the batching, and it matches other functionality on this site.
 
I’ve also noticed this but have always attributed it to career services asking students to write reviews rather than them being fake reviews from admissions officers. If you take a look at some of the accounts writing them you’ll see long post histories which indicates they are real accounts.

Maybe a current student can confirm or deny if they were asked by their program’s career services to write a review

Edit: I’ve noticed that some accounts choose not to post their reviews anonymously, which means you could verify via LinkedIn whether or not they attended the program.
 
I believe it's a real student writing it, I'm just wondering if the colleges are attempting to stack good reviews by only asking students they believe will write good reviews. I'm sure this is being done, but I'm not sure how large the practice is.

I'm also hoping that current students who wrote reviews can comment on this, and Andy, if he is actually batch uploading them.
 
Good point, I’m also curious to hear the answer now
 
I think you guys make some good observations. Most reviews that are posted in a short period of time are likely the result of a program's push to get more alumni to leave a review.
The majority of the reviews are genuine and from real students. If someone contacts us to get their account verified, they can still leave an anonymous review, but it will show up as verified. It's something we are working on. It gives their reviews more weight, and prospective students can benefit greatly from it.
For the record, the reviews are submitted by members without our involvement.
 
It's happening again. Do you think these recent reviews add any substance that would help prospective applicants make a more informed choice?
If not, is there any chance of changing the format so we can get more value?
The reviews are great, and I hold them in relatively high regard, but they are not perfect. Even if we can verify that reviews are from real students, it's impossible to know how representative that small sample is of the entire class. Mike raised a good point that programs could purposefully ask only top students to leave reviews, which would skew the results positively.

I can't think of a better format than what QuantNet currently uses; I just think people should be mindful when using the reviews to make big decisions.
 
That first review for the SIT program reads like pure marketing written by a bot; I can't imagine a real student writing a review with that many buzzwords and that much overly positive language.

I can't think of a better system, but a verification process before leaving a review is the easiest short-term improvement to try. It would be interesting to see whether there is a shift in the quantity and/or quality of reviews.
 
I'm going to attempt some filters so that reviews by verified members are displayed by default, with a button to show all reviews.
I think it's a reasonable workaround.
In the long term, the only way to encourage more genuine reviews from verified members is to make a strong case for them to do so. The question for them would be: "Why would I want to sign up and write a review? What's in it for me?"
It's just like universities asking alumni to donate or businesses asking clients to leave a review.
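The verified-by-default idea above could be sketched roughly like this (a minimal illustration only; the `Review` fields and filtering logic are my assumptions, not QuantNet's actual implementation):

```python
from dataclasses import dataclass

@dataclass
class Review:
    program: str
    rating: int            # 1-5 stars
    author_verified: bool  # has the member's identity been verified?

def visible_reviews(reviews, show_all=False):
    """By default, show only reviews from verified members;
    a 'show all' toggle reveals the rest."""
    if show_all:
        return list(reviews)
    return [r for r in reviews if r.author_verified]

reviews = [
    Review("CMU MSCF", 5, True),
    Review("CMU MSCF", 5, False),
    Review("UChicago FinMath", 3, True),
]
print(len(visible_reviews(reviews)))                  # prints 2
print(len(visible_reviews(reviews, show_all=True)))   # prints 3
```

With only a handful of verified members today, the default view would be nearly empty, which is exactly the concern raised later in the thread.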
 
By the way, in our next update you will be able to filter the reviews by star rating. Clicking on 1 star will show all reviews with 1 star, and so on, just to make it easier to find the negative reviews.
When I shop on Amazon, I spend time reading all the negative reviews of a product first.
 
How does someone even become a verified member? After all these years of QN, there are only six.

So this filter will effectively eliminate all results.
 
By using a real name that I can verify. Members using a random username should contact me to verify their identity via their LinkedIn profile.
I don't have a robust verification mechanism yet. For now, send me a DM here linking to your LinkedIn profile, and I will connect with you via LinkedIn to verify.
 
I can think of a way to prevent this clumping: set a time limit of 12 hours or a day before another new review can be added to the same program. That would prevent the mass surge of consecutive review bursts, which aren't helping anyone, since they are all way too positive and glowing.
 
If you ask me, the only thing that really matters is the QN ranking. Those are actual outcomes rather than personal opinion.

If I were to consider the reviews, the only ones that would really matter are the recent negative ones, as those might signal that a program is on the decline before it shows up in the rankings.
 
Nah, the QuantNet ranking is a single number. People's needs and priorities are diverse, and the reviews are valuable for that reason: there is a lot of detail in them that helps.
 