Comments
I would love to see similar data to what we have on exams, such as question-type strengths and weaknesses by percent correct, and the ability to filter the results by time: e.g., last week, last month, 3 months, or year.
Thanks, @Landon_brough!
I would like the LSAT Test Analytics to be updated to incorporate data from drills. Right now it covers only full PTs, but being able to see analytics from drills would be amazing.
I agree with Matthew. Not everyone takes full-length PTs, so it would be helpful if drill analytics were included.
Would it be possible to have analytics on timing?
I struggle a lot with timing, so I would love a way to know which question types I should strategically skip on the first run, based on how much time I spend on them while still getting them wrong.
E.g., I tend to spend 50 seconds over the target time on Match the Flaw questions and still have only 10% accuracy. Maybe an average over-target time for each question type could be shown beside the accuracy, based only on timed drills/PTs.
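For illustration, the over-target stat described above seems straightforward to compute from a per-question log. A minimal sketch, assuming a log with columns like `time_spent`, `target_time`, and `timed` (these names are placeholders, not 7Sage's actual schema):

```python
import pandas as pd

# Hypothetical log of attempts; column names are illustrative,
# not 7Sage's actual schema.
attempts = pd.DataFrame({
    "question_type": ["Match the Flaw", "Match the Flaw", "Sufficient Assumption"],
    "time_spent":    [130, 145, 70],   # seconds actually spent
    "target_time":   [80, 80, 80],     # suggested target, in seconds
    "correct":       [False, False, True],
    "timed":         [True, True, True],
})

# Restrict to timed attempts, then show average over-target time
# next to accuracy for each question type.
timed = attempts[attempts["timed"]]
summary = (
    timed.assign(over_target=timed["time_spent"] - timed["target_time"])
         .groupby("question_type")
         .agg(avg_over_target=("over_target", "mean"),
              accuracy=("correct", "mean"))
)
print(summary)
```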
Thank you for posing this question! I've thought about this a lot without really realizing it, so I appreciate the opportunity to dump my thoughts out to a willing recipient! See the list of stats I would find relevant below:
Timing -- it would be beneficial to have a breakdown of how much time I tend to spend on different question types, potentially with a normalizer for difficulty level. This could come in the form of an average and a standard deviation (i.e., on average, you spend 1 min 20 seconds on LR Sufficient Assumption questions, but you spend 45 seconds, or 2 standard deviations, longer on the most difficult questions of this type). A metric of this sort would give me a better sense of how to pace myself on PTs (see the rough sketch after this list for one way it could be computed).
Accuracy by Question Type -- I love my LR and RC Drills, and I would say I spend more than 75% of my time on 7Sage engaging with this aspect of my membership. On the whole, I have likely answered 10x as many questions in Drill mode as in PTs. This makes it all the more frustrating that I have no easily accessible stats for Drill questions, since the supposed "trends" 7Sage provides are based on an entirely unrepresentative sample (my PTs). I would love to access a red-circle chart, similar to the one created from PTs in "Analytics," for Drills (or, even better, Drills AND PTs together). One day, I hope the virtual tutor could tell you to focus on a specific question type based on your performance, without prompting (as an aside: please make the role of the virtual tutor clearer on the Drills page, as it is currently a bit ambiguous).
Section-Level Time Spent -- Somewhat along the same lines as (1), I would love a statistic telling you how much time you've spent answering drill questions for RC vs LR (more granular would be interesting, but I don't personally see access to more granular info impacting my studying much). I am sometimes conflicted about where to focus my studying on any given day, and I think a summary of my time spent (a) ever, (b) this month, and (c) this week could be a great way to show where I should be concentrating my efforts. This stat can also act as a motivator / indicator of progress for those of us who are more Drill inclined. It would be great if every time I logged in to practice, I was greeted with "XX hours spent on Drills!" or something of that ilk.
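To make the first item above concrete, here is a rough sketch of how a per-type timing average and standard deviation might be computed, with the difficulty normalizer handled by grouping. The column names and difficulty scale are assumptions, not 7Sage's real data model.

```python
import pandas as pd

# Hypothetical attempt log; column names and the 1-5 difficulty scale are assumed.
attempts = pd.DataFrame({
    "question_type": ["SA", "SA", "SA", "SA", "Flaw", "Flaw"],
    "difficulty":    [2, 3, 5, 5, 1, 4],          # 1 (easiest) to 5 (hardest)
    "time_spent":    [75, 80, 125, 140, 60, 95],  # seconds
})

# Average and spread of time spent, per question type.
per_type = attempts.groupby("question_type")["time_spent"].agg(["mean", "std"])

# How far each attempt sits from its type's average, in standard deviations.
attempts = attempts.join(per_type, on="question_type")
attempts["z"] = (attempts["time_spent"] - attempts["mean"]) / attempts["std"]

# "You spend X standard deviations longer on the hardest questions of this type."
hardest = attempts[attempts["difficulty"] >= 5].groupby("question_type")["z"].mean()
print(per_type)
print(hardest)
```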
An option to attempt a question again without the answer being in view, for a full drill or a single question; just an indication that it's wrong, with the choice to try again or reveal the answer.
@Landon_brough said:
I would love to see similar data to what we have on exams, such as question-type strengths and weaknesses by percent correct, and the ability to filter the results by time: e.g., last week, last month, 3 months, or year.

This is the most important factor for me. I have done the majority of my studying with drills, and this would be a big help.
This is maybe a little out of left field, but an analytic leveraging PCA (principal component analysis) to bucket 7Sage users and then suggest drills based on classification could be really helpful in identifying what to spend time on to study most effectively. Basically, if you're able to group students using the super-rich datasets you've got, and then compare PT improvement for students within those groups based on different drill histories, it could help identify which drilling techniques work best for each student's "type" -- it seems like people's difficulties in some ways fall into categories, and this could be a cool way to leverage the data of other students in your archetype.
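For what it's worth, here is a very rough sketch of that kind of pipeline in scikit-learn. The feature matrix (per-type accuracy and timing per student) is an assumption about what might be fed in, and the component and cluster counts are arbitrary.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

# Hypothetical per-student feature matrix: rows = students,
# columns = e.g. accuracy and average time for each question type.
rng = np.random.default_rng(0)
features = rng.random((500, 40))

# Standardize, reduce to a few principal components, then bucket students.
X = StandardScaler().fit_transform(features)
components = PCA(n_components=5).fit_transform(X)
buckets = KMeans(n_clusters=6, n_init=10, random_state=0).fit_predict(components)

# Within each bucket, PT improvement could then be compared against drill
# history to see which drilling patterns pay off for that student archetype.
print(np.bincount(buckets))  # bucket sizes
```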
This isn't an analytic, but it would be helpful to be able to drill one LR question at a time and immediately review it, instead of the minimum of five.
@"Jaz Williams" we have that feature on our new website, still in beta. Let me know in these comments if you’d like to be a beta tester.
This pertains to PTs. There should be an option to blind review directly after a section and to review the answers, rather than having to take the whole test before being able to review. Some days I am not able to take the full test in one sitting, so when I go back to review the questions I got wrong, I have already forgotten my reasoning for them.
@"David Busis" I would like to be a beta tester. It would also be interesting if it had the aforementioned AI feature that would build drills based on your analytics.
I wish I could filter out variable sections for analytics on PTs--I skip them since I'll be skipping on test day and they really skew my averages.
Analytics on every question that I do. Then I could filter by test, full-length sections, and practice sets -- and obviously by question type and difficulty as well. Having a date filter option would be awesome too.
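A filter set like that maps fairly directly onto a single per-question log. A minimal sketch, with entirely made-up column names:

```python
import pandas as pd

# Hypothetical unified question log; every column name here is an assumption.
log = pd.DataFrame({
    "source":        ["PT", "practice_set", "timed_section", "practice_set"],
    "question_type": ["Flaw", "Flaw", "MBT", "Flaw"],
    "difficulty":    [4, 5, 3, 2],
    "correct":       [True, False, True, True],
    "answered_at":   pd.to_datetime(["2024-05-01", "2024-06-10",
                                     "2024-06-12", "2024-03-01"]),
})

# Slice by source, question type, difficulty, and date window.
cutoff = log["answered_at"].max() - pd.Timedelta(days=90)
mask = (
    log["source"].isin(["PT", "timed_section", "practice_set"])
    & (log["question_type"] == "Flaw")
    & (log["difficulty"] >= 4)
    & (log["answered_at"] >= cutoff)
)
filtered = log[mask]
print(f"Accuracy: {filtered['correct'].mean():.0%} over {len(filtered)} questions")
```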
Drill analytics, for one. Also, I don't know if this would be practical or not, but categorizing questions further -- maybe using a view that can be toggled on and off so as to avoid overwhelming the person studying.

What I mean by the former part (categorizing questions further) could look like this: you get an MBT question in an LR section. It requires your knowledge of the fourth valid argument form and contains math in the stimulus. Being able to repeat a very similar type of question (with those three areas involved -- MBT, math, valid argument form) would be SO useful!! I make the same mistakes sometimes, and though I recognize it, it's hard to efficiently drill the more specific weak spots because I can't find a similar enough question to work on next. It usually happens that I encounter similar ones only after a long time of working on other questions (despite working on the same broad question type, such as MBT), so I end up making the same mistake again when perhaps that could have been avoided with a more specific categorization system in place.

What I mean by the latter part of my idea (the toggling) is that if there were a specific categorization system, learners starting out might think there are too many tags and feel overwhelmed. So, having categorizations like "MBT" (a big question type) and "Math" (a big stimulus type) always toggled on is okay, but a more specific tag like "Valid Argument Form" (a narrower knowledge area that could happen to appear in question categories from MBT to Wkn and anything in between) could have its visibility toggled off by someone who would rather not get caught up in too many specifics while gaining an introduction to the test.

Again, I don't know if this is a very practical suggestion, but something like it would make a WORLD of difference in efficient studying, I think!
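If something along these lines were ever built, the data model could be as simple as attaching both broad and narrow tags to each question and letting the UI hide the narrow tier by default. A toy sketch of that idea (the tag names and tiers here are illustrative, not an official 7Sage taxonomy):

```python
from dataclasses import dataclass, field

@dataclass
class Tag:
    name: str
    tier: str  # "broad" (always shown) or "specific" (toggleable)

@dataclass
class Question:
    id: str
    tags: list[Tag] = field(default_factory=list)

# A hypothetical question carrying one tag per level of granularity.
q = Question("PT73-S2-Q14", tags=[
    Tag("MBT", "broad"),                      # big question type
    Tag("Math", "broad"),                     # big stimulus type
    Tag("Valid Argument Form", "specific"),   # narrow knowledge area, hidden by default
])

def visible_tags(question: Question, show_specific: bool = False) -> list[str]:
    """Return tag names, hiding the specific tier unless it is toggled on."""
    return [t.name for t in question.tags if show_specific or t.tier == "broad"]

print(visible_tags(q))                      # ['MBT', 'Math']
print(visible_tags(q, show_specific=True))  # all three tags
```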