TEQSA Occasional Forum Series – First Year Student Attrition in Higher Education

>>ANTHONY MCCLARAN: I’m Anthony McClaran,
CEO of TEQSA and I’d like to welcome you all here. Those of you physically
present and those of you watching remotely. Can I begin by acknowledging the
traditional owners and custodians of the land on which we’re meeting today, the Wurundjeri
people of the Kulin Nation, and pay my respects to their elders, both past
and present. I mentioned those who are watching this remotely. Do feel free to enhance
the remote conversation by tweeting. You’ll see the hash tag up there. #teqsaforum.
And also if you want Wi-Fi, the network is EA Victoria, and the password, if you
see what I mean, is V, and then 1, and then the rest of the word Victoria.
The notice that tells me what the Wi-Fi address is says this is a shared space so
please be courteous and clean. So, with that admonition ringing in our ears, I want
to proceed to this afternoon’s event. And first of all, to introduce our three speakers. First, Dr Lin Martin, who’s
been a Commissioner at TEQSA since February 2015. Lin’s had a very distinguished
career in university administration and management. That’s lovely.
From 2009 to 2011 she was Head of University Services and Vice-Principal at the University of Melbourne,
where she also held the position of Vice-Principal and Academic Registrar from
2001 to 2005. From 2005 to 2008, Lin was Vice-President and Council Secretary at
Deakin University and has held many other senior positions. Our second speaker will be Professor Pip Pattison,
appointed Deputy Vice-Chancellor (Education) at the University of
Sydney in June 2014, responsible for that university’s strategy and vision for teaching
and learning and students’ educational experience. She oversees institution-wide
development of better support for student learning, including the university’s
approach to curriculum renewal, new thinking in pedagogy, learning and teaching
analytics, e-learning, and quality assurance for learning and teaching. An exhaustive, exhausting
portfolio, Pip. And finally, to welcome Professor Nick Saunders,
Chief Commissioner of TEQSA since 2014. Nick was Provost and Deputy VC
of Bond University after retiring, and I use that word very much in inverted commas,
as Vice-Chancellor of the University of Newcastle. Other senior academic roles for
Nick have included Dean of Medicine, Nursing and Health Sciences at Monash University,
Head of the Faculty of Health Sciences and Dean of Medicine at Flinders,
Professor of Medicine at the University of Newcastle, and positions at McMaster
University in Canada and at Harvard in the US. So, welcome to all three speakers. All here
to talk this afternoon at the first of the TEQSA occasional forum series, and the subject
is student attrition in higher education. Very topical and I think the response
to this forum and the numbers of people who’ve come here this afternoon is
an indication of the huge interest in this subject at the moment. It’s of interest to
TEQSA in many ways but very specifically, because attrition is one of the 12 risk indicators
in the risk assessment framework that TEQSA uses and it is by some measure
the indicator most likely to trigger either a high or moderate risk rating: 38% of providers
are at high risk on levels of attrition, and 21% at moderate risk. It’s a subject that
has attracted the attention of the Higher Education Standards Panel, and in their report
on admissions transparency, which some of us here have been involved with, they
also said that further consideration should be given to assessing the factors and
approaches that contribute to student success, completion and attrition rates in
higher education. And the minister picked up that particular point and many of you will
know and indeed may well already have responded to HESP’s discussion paper, Improving
Retention, Completion and Success in Higher Education which they issued
last month. And there’s also a wider policy dimension flagged up in the higher
education reform package that the Minister published in May where there was some discussion
about performance-based funding in the future and the performance
measures to which that might be linked. Much detail still to be filled in there,
it’s true to say and Lin will no doubt indicate when she speaks, that our report predates
all of those more recent policy developments by some way, but it’s inevitable
in that particular climate, that of course, elements in this paper will become
part of that discussion and debate. I think in terms of a further introduction,
before handing over to Lin, I can do no better than use the words that are right at the beginning
of the report. Over the last 20 years, there’s been significant interest in
factors leading to student dropout from first-year higher education studies. Factors identified
include a range of personal attributes of the students themselves as well
as academic and administrative aspects of higher education institutions’ operations. Concern over attrition is primarily centred
on financial and reputational issues for governments and for the institutions, but
the report reminds us the issue is of considerable significance for the students
themselves in terms of wasted time and personal debt. So, without any further ado,
Lin, if I could ask you to present the report. Thank you. [Applause]

>>LIN MARTIN: Thank you, Anthony, and yes,
what an array of interesting people before me. It’s interesting to see what
200 chairs look like. But we managed to build the image. Thanks, Anthony, for the opportunity
to talk about the report. I think the key thing about this report and the analysis
that we get is that it’s the first attempt to
look at attrition across the whole sector. There’s a lot of public information about
universities and their attrition and so forth, but nothing really of a consolidated nature
to do with the non-university higher education providers. So that was one of the
things that we really wanted to achieve by preparing this report. I think there are
copies of the report somewhere, so please feel free to come and get a
copy after the discussion has finished, because there are some quite interesting
appendices and so forth in the report. So that’s what it looks like. As Anthony said, TEQSA uses 12 performance
indicators to assess whether the institutions it is responsible for regulating
show some level of risk in terms of these 12 indicators, and one of them,
as Anthony said, attrition, is the one that generates the biggest proportion of institutions
with higher than medium risk. It really stands out when you look at the indicators.
So, the Commission was interested, therefore, to ask: what is
it about this fairly large spectrum of institutions that we regulate, what is it
in the student profile, or in the characteristics of the institutions
themselves, that we might be able to associate with higher rather than lower
levels of attrition. So that was how the project started, and to take that a bit further, how
could it assist TEQSA in identifying potentially high-risk institutions, and through
the conditions and the data that we request as part of the regulation process,
perhaps we might be able to identify some strategies that would be good practice strategies
to lower attrition. So, that was the basis of the project. We
worked on the data set that corresponded to the most recent year at the time we started
the project. So, we started working in the second half of 2015
and we were using 2014 attrition data. And there were 173 institutions registered
with TEQSA in March 2014 which corresponded to that data year. And 18 of
those 173 didn’t have sufficient data to actually calculate attrition rates in the
way we do and the way we define them. And the reason for that is they might have just been
starting up, or they might have been tapering off and winding down as an institution.
And so that 18 out of 173 really didn’t have sufficient data or data in which we had
a great deal of confidence, perhaps, is the way to put it. And so, if you took that 18 away, that left
155 and we thought about those, but then when we started to look at the nature of the
data that we wanted to consider, in fact a number of the private providers do not submit
their statistical data through the Higher Education Information Management System,
or HEIMS, and in fact, we couldn’t get the same range of data for
all the institutions across those two collections. So, in the end, we accepted
a smaller number in order to get a complete data set. And so, we ended up with
130 institutions, of which the public universities are of course 39 in number, so
a very large proportion of these are non-university higher education providers. So, what was the definition that we used?
It’s the one that’s articulated in TEQSA’s Risk Assessment Framework, which is quite
a technical document; it’s got all the actual definitions and the formulae for
how these 12 performance indicators that TEQSA has identified are calculated.
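In outline, a raw first-year attrition rate of this kind (the share of students commencing in year N who neither complete nor return in year N+1) can be computed from sets of student identifiers. The following is an illustrative Python sketch with hypothetical student IDs, not TEQSA’s exact formula, which is defined in the Risk Assessment Framework:

```python
def raw_attrition_rate(commencing, returned, completed):
    """Share of year-N commencing students who neither completed
    nor re-enrolled at the same institution in year N+1.
    Illustrative only: the precise formula is set out in TEQSA's
    Risk Assessment Framework."""
    retained = returned | completed      # IDs still accounted for
    lost = commencing - retained         # commencers who disappeared
    return len(lost) / len(commencing)

# Hypothetical student-ID sets for one provider:
commencing_2013 = {"s01", "s02", "s03", "s04", "s05"}
returned_2014 = {"s01", "s02", "s03"}
completed_2013 = {"s04"}

print(raw_attrition_rate(commencing_2013, returned_2014, completed_2013))  # 0.2
```

Note that on a raw definition like this, a student who moves to a different provider still counts as attrition at the first institution, which is the raw-versus-adjusted distinction discussed in the report.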
And the definition is there. It’s the same sort of definition that’s used by the
Department of Education and Training for the university sector, and it’s the same definition
that’s been used for many, many years. There’s been a continuous time
series of data going back to the early 2000s, at least in my private library at home,
and public documentation that’s been around in the sector for a long time for the
universities. But there are two critical distinctions. This definition, as you see there, doesn’t
differentiate between undergraduate and postgraduate students, so in our attrition
rates at TEQSA, we look at a first-year student who is studying a postgraduate course:
do they continue or not? So ours includes both undergraduate and postgraduate.
What TEQSA has done here is that our attrition rates are for the
whole population of first-year students, enrolled at all levels of courses,
undergraduate and postgraduate. And the other difference is that
we include domestic and international students, and the Commonwealth only includes
in their analyses, domestic students. The final thing, that there’s been a bit of
controversy about, I suppose, is that we use what we call raw attrition rates,
just using that calculation. But the Commonwealth publishes adjusted attrition
rates. Now the adjustments are made using the student
identifier, the CHESSN, which is an allocated whole-of-life number
for a student in higher education in universities. The reason
the CHESSN is collected is to try and keep track of resources
that are provided by the Commonwealth. So, the CHESSN allows the Commonwealth to
be able to track students if they, say, start a course at one institution and then
move: they in effect leave the institution, and so they would register as
attrition at that institution. But if they go on
to continue a course, not even necessarily the same course, at another higher
education provider where CHESSNs are collected, then the loss, the attrition, is
adjusted downwards by the number that have moved but are still in higher education,
just not at that institution. Now, the problem is, if you want to do what
our aim was, which is to look at attrition across the whole sector, then we need to have
the same data for all of the institutions, and because we’ve got international
students in our population group, we can’t use adjusted attrition because we
have no way of checking how those students, that group of students, would move
over time in terms of their tendency to stay or leave higher education. So, for those
purposes, at least for the 130 institutions that we were looking at, we have
used raw attrition rates and they are all calculated in the same way and using the same
protocols, whether or not they’re a university or another type of higher education
provider. So that’s an important difference. So, Anthony mentioned risk, and how TEQSA operates
on a risk basis to identify issues that might come up in a registration
or accreditation assessment that we look at, at TEQSA. And so, we were interested
in thinking as I said, what the proportions were, and you see this little
table. This gives the percentage of institutions in the high-risk,
moderate-risk and low-risk areas over the last three years, and it’s quite a polarized distribution
because the percentage in low risk is either the same or just lower than the
percentage in high risk. Now, that’s a very high percentage: over 40 per cent
of the 130 institutions are actually categorized as high risk in terms of their
loss of students through attrition. And what’s been happening is that it has actually
been getting slightly worse, because the percentage of institutions in high risk
is increasing very slightly; institutions are sort of
moving across from low risk to moderate risk to high risk. The other thing you notice as soon as you
start looking at the whole sector is that there is a considerable diversity of institutions
in the higher education sector in Australia, and a wide range of attrition values
across these 130 providers. In fact, half the providers have attrition rates greater
than 25%. That’s the whole profile of the 130 institutions, and the orange ones
are the universities and the blue ones are all other types of higher education providers.
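The distributional figures quoted here are ordinary percentiles of the provider-level rates. As a sketch, in Python with made-up rates chosen so that the quartiles echo the report’s 16%, 25% and 31% (the real analysis used the 2014 rates of the 130 providers):

```python
import numpy as np

# Hypothetical provider-level raw attrition rates (per cent), stand-ins
# for the 130 registered providers' 2014 figures.
rates = np.array([12.0, 16.0, 16.0, 22.0, 25.0, 27.0, 31.0, 35.0, 40.0])

# Lower quartile, median and upper quartile of the distribution.
q1, median, q3 = np.percentile(rates, [25, 50, 75])
print(q1, median, q3)  # 16.0 25.0 31.0
```

With the real data, half the providers sit above the median of 25%, a quarter above 31%, and a quarter below 16%, exactly as described in the talk.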
So, when we looked at this, it was quite alarming, because you know, when half of the
population has an attrition rate of greater than 25%, so you’re losing one in
four of the students you recruit in first year, you’re sort of not making as much progress
as the institutions or the Government might expect. And in fact, 25% have attrition
rates greater than 31%, and the lower quartile is 16%, so about a quarter
of the 130 have attrition rates below 16%. So, we started to look at this issue of what
might the characteristics be that are associated with these high rates of attrition,
and at first, we thought this is a fairly easy statistical problem. We would do a multivariate
regression model for the whole sector, the whole 130, with all of the data
that we had, and see if we could model what significant factors came out in that
regression model. But the regression model that we attempted to fit was not a
particularly good fit, and so we largely followed the advice of a statistical consultant we hired.
His name is Alonso Matta. Alonso, are you here? Yes, he’s up the back.
Alonso was keen to try clustering groups across the sector and seeing whether
it was possible to get better fits within these subsets using the information that we
had about each provider. And so Alonso’s view was that we could do better
by segmenting the sector, and his advice was to do this using hierarchical cluster
analysis, which is a sort of developmental-type statistical procedure, and we used 17
variables, which are described in the copies of the report, in Appendix
2. In fact, of the whole range of variables we used, and there were 36 of them, we used
17 of the 36 to discriminate within the whole sector to provide
the clusters. And really the purpose of the cluster
analysis, in layperson’s terms as I understand it, is that clustering should produce
groups whose members have more in common with each other than with the rest of the
population. And so, that was the purpose of doing it; I hope you can see that. So,
using a statistical package called R, this is what the outputs look like,
and you can see that these clusters, or
branches of this tree structure, come out fairly distinctively. The length of the
vertical lines is an indication of
how far away that particular
group is from the rest of the population. So, after much development work, I
suppose I’d call it, we came up with this model using the R package. And in fact, it
turns out that cluster one consists of the universities. There is no non-university
higher education provider in that first cluster. And of course, we know that
the universities are usually pretty large institutions, and so they have a number of
characteristics in common, but we’ll talk a little bit about that later. The second cluster is a group of very small
institutions, around about 400 EFTSL on average for each institution, and they are
a group where the predominant factor is that they have a student population that is
very much in the Society and Culture field
of study. They in fact tend to be the faith-based institutions, where the
students are studying religious studies, philosophy and so forth.
So, they came out as a group. The third one is the smallest group of the
four, the smallest cluster of the four, and they are medium-sized institutions,
about 800 EFTSL, different from cluster two but not nearly as big as the universities
on average. And that group is a group that’s focused on international students as
part of its profile, and a big enrolment in Business and Economics or Commerce. And the final cluster is a set of institutions
that enrol largely domestic students and they cover a wide range of fields of study
so that they are very different and most of the private providers are in cluster 3 or
cluster 4. So just a summary of it, I suppose. Clusters
are defined according to their profile. We looked at average values for each segmentation
variable, every variable that we used to segment the population, and that drove
the output in R. And so, as I
said, there were four groups and I think I’ve gone through those but they are very
distinctive groups when you look at them, and it comes down to a combination of
things such as the field of study, the student profile, and where the main
catchment is, I suppose, for each institution. So, these are those four groups, the
same sort of chart as the one for the whole sector. And you notice these are
not to the same scale, so don’t just look at it and think that one’s worse
than that one, because the scales are different. But cluster 1, which was the universities:
the median attrition rate is 18%, and the average is slightly higher,
about 20%. Cluster 2, which were the small institutions that were faith-based,
and you can see that by the incidence of the grey bars in the chart. And immediately, the
median attrition rate has gone up to 27%. Cluster 3, which was the internationally
oriented, you see the number of orange bars there, they’re the for-profit
institutions. So, these are private providers that are running their business for profit.
Interestingly, the median there, which is 20%, is not that different from the universities
one, but when you actually calculate the average, it’s about 27%. All these higher
attrition institutions are up the far end. And the final one, which is that mixture group,
but a large group of 41 institutions. The median attrition rate there is 28%, which
is quite high really. So, once we got the institutions into those
clusters and could see the different characteristics that were there in the basic
plots of the data, we then tried to fit a similar multivariate regression model
to each of the clusters. And this chart, if you look at the columns, Cluster 1, which
is the universities, what it depicts is those variables that came out through
that regression analysis as being significant variables. So you might think
of these as the significant factors. So, if
we look down the university cluster, the factors that came out were the percentage of
external students that were there, the size, the total EFTSL of the institution. The VET one is an interesting one because
it is not the proportion of VET students that might be in a multi-sector institution; it’s
the proportion of the intake that has come from a VET background, so students admitted
on the basis of VET, as a share of the total student population. And that one, I think,
is an interesting characteristic in terms of
the universities, because in my experience in universities
there’s not a great tendency, or hasn’t been in the past, to admit students
on the basis of their VET studies. Then the proportion
of senior academic staff employed, and the proportion of postgraduate students out of
the total enrolment. So out of the 36 variables that we had, these five
came out as the most significant. Now, the signs after them indicate the direction
of the inference. So, plus means that a
larger value is associated with increased
attrition. For example, external: the higher the proportion of external students in
the institution, the higher the attrition is likely to be. With EFTSL it’s the reverse: the
smaller the institution, the higher attrition is found to be. So, they’re not predictors, they’re explainers
of the patterns in the data. And so, if you look, I won’t go through all of these,
but you can see how they differ between the various clusters. And there’s one really
low group, cluster 2, that isn’t a very good fit.
But getting R-squared or adjusted R-squared rates of 86%, that’s extremely good, so
it shows how coherent, I suppose, the universities are in terms of this. But 57% or 58%
is an acceptable level of explained variance in the model. So, you also see from that slide that
there are several characteristics that recur: the VET variable appears in both cluster 1 and cluster 3,
the full-time academic staff variable appears in clusters 3 and 4, and the senior academics variable appears
in clusters 1 and 4. So, you can see that the models are not the same, but we’re getting as a
whole a much better fit than just looking at the sector as a whole without looking at
any of the relationships that might exist between the institutions within the cluster.
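The segment-then-fit idea described here, clustering providers first and then fitting a separate regression within each cluster, can be sketched as follows. This is an illustrative Python sketch on synthetic data whose coefficients deliberately differ by cluster; the actual analysis was done in R on 17 real segmentation variables:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic provider-level data: two explanatory variables (think log-EFTSL
# and proportion of external students) plus a cluster label that stands in
# for the output of a prior hierarchical clustering step.
n = 120
X = rng.normal(size=(n, 2))
cluster = rng.integers(0, 4, size=n)
# True coefficients differ by cluster, mimicking the finding that different
# explainers matter in different segments of the sector.
betas = {0: (-2.0, 1.5), 1: (1.5, -1.0), 2: (-1.0, 2.0), 3: (2.0, 0.5)}
y = np.array([20 + betas[c][0] * x1 + betas[c][1] * x2 + rng.normal(scale=0.5)
              for (x1, x2), c in zip(X, cluster)])

def r_squared(X, y):
    """Ordinary least squares via lstsq; returns the R-squared of the fit."""
    A = np.column_stack([np.ones(len(y)), X])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ coef
    return 1 - resid.var() / y.var()

print("pooled R^2:", round(r_squared(X, y), 2))
for c in range(4):
    mask = cluster == c
    print(f"cluster {c} R^2:", round(r_squared(X[mask], y[mask]), 2))
```

On data like this, each within-cluster fit explains far more of the variance than the single pooled fit, which is the same qualitative pattern as the report’s per-cluster R-squared values.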
So that, I suppose, is the major finding from
this study, and I suppose I’ve summarized it by saying there are explainers of attrition
that are specific to certain segments, that is, the clusters, but some affect multiple
sections of the higher education sector. And one of the things that was surprising,
because we did include these variables in the modelling, is that our analysis didn’t
reveal any strong link to ATAR values in universities. But I must stress that the data we’re
working with is aggregated data so we’ll be looking at, for example, the percentage of
students in the secondary school assessment that might have an ATAR value of
a certain range and so forth. But if what’s in the press had actually been of real
significance, and what’s been in the press is that the sector’s quality is declining
because students are being admitted with lower ATAR values, you would expect it to show up here. And
in fact, it certainly doesn’t come up as a major determinant of attrition. Also,
there’s no link to low socioeconomic status and so on, as appears there. And if we just look,
this isn’t to do with our study, it was just background that we looked at. Australia’s
completion rate, which is of course sort of
the inverse, I suppose, of attrition, shows that Australia is about the OECD average.
The OECD average is 70%, and the UK and the US have
considerably better completion rates than Australia, so that’s why this is such an
important issue. So, I’ll just finish because we�re a bit
delayed. As far as we’re concerned at TEQSA, this work does give us more information about
the factors that are involved with high levels of attrition or lower levels of attrition
in a sector, and so it sort of points the way
forward. One of the reasons that the work we did which was at the institutional level,
not the individual student level, which is what the cohort analyses do. What we do at
TEQSA is look in terms of regulation, the registration of an institution. So, we’re
interested in institutional level data, less about what the specific mix looks like in
terms of the students, though we do collect some data on that. And so we took
an approach like this, which is somewhat different to the usual approach of
cohort analysis at the individual student level. What we thought was
that this approach of looking at institutional data, looking at clusters and groups,
fitted very well with the regulatory processes that we currently use, and that’s
why we did it. So, I’ll leave it there and hope that the PowerPoint holds up for you. [Applause]

>>ANTHONY MCCLARAN: Thank you very much, Lin,
and thank you for your stoic reaction to the technical difficulties. Without
any further ado, could Pip Pattison now come up; Pip is going to respond to some of
the points made in that research. Thanks, Pip.

>>PIP PATTISON: Thank you, Anthony, and thank
you Lin. It is a pleasure to be here to respond and I would like to start
by thanking TEQSA for initiating this kind of
analysis. I think the more we discuss important issues like attrition rates with a lot of
data in front of us, the more progress we’ll make on really identifying those
factors that make a difference. So, I’m going to briefly touch on four points, and I’ll
go as quickly as I can, because I know we’re delayed but I’ll start by saying why I
think this institutional level analysis is so important. I’ll make a few statistical
remarks about this kind of analysis which Lin hasn’t
emphasised but which are actually clearly set out in the report, and then just a few
brief remarks on what I think we’ve learned and where we might go next. So, to begin with, of course this analysis
is helpful because it’s more analysis, it’s more evidence, it increases our understanding
of our sector, and I think I particularly point to our capacity to have a much richer
understanding of the variation and heterogeneity in the sector. But in addition
to that, and it is certainly the case for people in roles like mine, we often think
about issues like attrition or completion rates
for students as institutional problems, and we set strategies at institutional level.
So, understanding the picture from an institutional
point of view is actually very important and very helpful for us. I think it’s also
very interesting that the analysis that was published in the Higher Education Standards
Panel report, Improving Retention, Completion and Success, showed actually that
institutions were a major source of variation in modelling attrition at the student
level which points to the value of actually looking at the institutional level
as a way of advancing understanding. And then finally, as many people have commented
about the Australian higher education sector, it is a heterogeneous one and the
more we can do to understand that the better, and I think the analysis that Lin has
just described improves both our understanding of the sector and its characteristics
and our understanding of how they play into the prediction of attrition. And here’s just a statistical point. I used
to teach statistics so I can’t help myself. It is
important, of course, with any study that takes a body of data and fits a regression
model, to remember that we obviously can’t infer causal relationships from a regression or any kind of
correlation. And for the correlations that we
identify through this analysis at the institutional level, we can’t infer that they also
exist at the level of individual students. That’s an important point, and I think
the point about ATAR that Lin just emphasized is
a good example of that. We do have the entire population of institutions in this
analysis so that makes for some interesting thoughts about the nature of inference in
this circumstance. I can elaborate on that if
you want later, and of course, in any study that aims to provide an explanatory
model, what’s missing from the model needs to be in the back of our minds always,
and so there might be some important characteristics of institutions that are not
included here that had we included them, might change the picture we see. I was
reading a US study recently which used per-student expenditure on
student services as a predictor and found that to be quite important. So that would
be an example of such a thing that might be missing, and of course this is, as Lin
has said, a very exploratory analysis. But we’ve learned a lot from it I think. I
think that the description of the sector that Lin
just gave us really articulated a much richer and better sense of the sector as a
whole and that’s valuable. And as Lin also emphasised, some of the relationships we
saw are important in a number of settings, both across the whole set of providers,
but also within one or more clusters. And clearly things like size, the
comprehensiveness that goes with size and with having a large postgraduate
population, and the depth of expertise that we hope goes with having a high proportion
of senior academics, these are important, and that I think is affirming to see. But also
the compositional aspects of the student body, a large number of external students,
part-time students, students admitted on the basis of VET. It’s very interesting to
understand those relationships between compositional factors for an institution and
the attrition rate at the institutional level. And as Lin also emphasised, it’s really
interesting to see what’s missing from this analysis. Things like median tertiary entry
scores, student-staff ratios, satisfaction measures, employment outcomes, and
SES. Now, that’s not to say that those factors are not important at the individual
level, and I’ll say a bit more about that in a moment, but it is interesting that
in terms of composition they weren’t important
here. And of course, a really vital point is that the analysis that Lin presented
shows that there is important variation across the various clusters, so we have a
very rich level of heterogeneity, both in the
nature of the institutions in our sector, and in the ways in which attrition plays out
in those different parts of the sector. So, I’m going to focus the rest of my remarks
on where I think this should take us going further. And what immediately struck
me was how valuable it would be to put together the kind of institutional data set
that has allowed this analysis to take place with a student level data set so that we can
begin to try and understand attrition or success or completion in its full multi-level
complexity. And of course, when we think about attrition, it’s not hard to imagine
that both student level and institutional level
characteristics must be important to really understanding what’s going on in what is
fundamentally quite a complex process embedded within a multi-layered system. And if we look at the research literature
on attrition and degree completion, we find conceptual models that propose this and indeed
are supported by analyses in other countries. I couldn’t actually find any multi-level
analysis of Australian data, possibly because the datasets have not been combined
in a way that would allow that, but I do think that would be an interesting step
to take, but certainly in other countries, there are some interesting examples of the
kind of analysis I think it would be useful to do. The study I mentioned there by Chen
even incorporates longitudinal data of students over a six-year period trying to
understand the progressive loss of students through that process, either to degree completion
or leaving their degree. And these sorts of … I have to figure
out how to go back. The sorts of conceptual models that I think it would be
very interesting to explore are illustrated by this one from 2009, by some US authors who
have aimed to model attrition at the student and institutional levels
simultaneously, which is what a multi-level model would allow. At the individual student
level, characteristics of the students like their backgrounds, their experiences, their
attitudes and indeed, features in their own local setting, for example, financial considerations,
the opportunities for employment. All of those might impact on their individual
probability to actually remain in study and go on to complete their degrees. And in addition
to that, there’s an institutional propensity, or impact on that probability,
which is then in turn at another level, modelled in terms of compositional and structural
characteristics of the institutions themselves, including potentially some aspects
of the climate of those institutions that might be measured by various ways in
which students express their intention to stay, or they may reflect staff views about
the importance of teaching in an institution or the importance of retention and completion
of degrees by students in that environment. And those two levels then constitute
a single model which allows us to unpack both the individual impacts on a student
of particular characteristics, as well as the compositional characteristics of that
student cohort on the institutional level effects. And here’s an example of the findings
of a study of that kind. Quite a large study: 37,000 students from 170 four-year
colleges and universities in the US using a similar kind of methodology at one level,
relying on national surveys that are completed by students in those institutions
on a sample basis, and then looking at whether or not students have completed after
six years, plus integrating into that, some data about the institution from other
sources, in this case from a survey of the staff of those institutions. And in a multi-level model, finding that,
certainly a lot of the variation is explained by
student level effects and student compositional effects. In that case, many of the
similar things that we would expect to be important here, but then in addition to that,
the capability of showing that actually some of the structural characteristics
of the institutions matter; the perceived climate
in relation to a culture of educational excellence matters a little bit, but in this
large study, enough to make a difference. And then some aspects of climate of the institution
which students themselves can report through their annual reporting of their
expectations of whether or not they will complete. And so together, that I think, illustrates
the kind of analysis that we might hope to go on to next where we can look at
both student and institutional level impacts. And then one last couple of points. The next
thing I would add is that, in fact, Lin's study, or this study, has shown how important
institutional heterogeneity is, but when we look inside individual institutions, they're
not all homogeneous either, and I think many of us, when we look carefully, find
variations that are quite substantial from one program to another. And so, I think we
can use some of the same methods that I've just described to actually understand
heterogeneity within institutions much more clearly and therefore, in ways which will
allow us to act more purposefully and helpfully. The third point I’d make then is that it’s
really interesting once we have some sense of these baseline models about student characteristics,
compositional characteristics, broad characteristics of
an institution. What can we do as institutions to then drive degree completion to higher
rates and reduce attrition in the process? And here, I think, there are a lot of very
promising and really quite exciting initiatives under way, both here and in other parts of
the world. I was going to just mention these two but I'll add a third. One is,
I think, many institutions are making much better use of data to understand the early
signs of an intention to abandon study and are actually ensuring that they’re providing
the support they can to ensure that those sorts of decisions are made in ways that really
are best for individual students. I do remember a student giving me her business
card many years ago as she left halfway through first year to start a very successful
business. Some attrition, I think, does come because people really do have very attractive
alternatives in front of them at a particular moment in time. And so, I think the more we actually understand
the broader picture, the more carefully we can appraise these kinds of initiatives,
and here I’ve mentioned the student relationship engagement system which
was developed at the University of Sydney and is being used in a couple of different
places, but there are many different examples of that kind. And then the second
is some very exciting brief values affirmation interventions that have been developed
by Geoffrey Cohen and his colleagues at Stanford University which are
having quite dramatic effects used in appropriate ways to reduce the achievement
difference between students who come from cultural minority groups into a majority
culture setting. And I think, again, that that’s an example of something which we could
evaluate with much better data on what other factors need to be understood in
order to evaluate that effect. And then finally and not mentioned here on
the slide of course, I think we’re all thinking about ways of presenting information
to students in ways that will help them make better choices at each step along the
way. And I think government’s thinking about that, everyone is thinking about that.
I think the more we can evaluate attempts to present information in ways that
allow us to actually assess their impact, again, the more quickly, we'll hone in on
some really effective strategies. So let me finish by again thanking TEQSA for the report.
At my institution we've been very interested to discuss it and I'm sure that's
true around the sector, and I think that probably explains the large number of you
that are here today. So thank you. [Applause]>>ANTHONY MCCLARAN: Thank you very much indeed
Pip. I would like now to ask Pip if you could take one of those chairs.
Lin, and also Nick Saunders are going to take the chairs at the front there. I will
attempt to deal with the questions from here I think because it's far easier to see hands
going up from this position. This is an opportunity for questions, either to Lin about
the research, to Pip about that very productive set of responses and suggestions
and pathways for future work, and to Nick in terms of the view of TEQSA. We’ve
stressed already the view of TEQSA, the starting point around risk assessment, but
of course there's another aspect which is our wish to work with the sector, to point
to and to encourage and to support some of those instances that I think Pip was referring
to, of good practice. What are the interventions that can actually make a positive
impact upon attrition, issues of attrition. And thanks also Pip for bringing
into the frame right at the end there, the link
between attrition and a much wider series of questions which I guess I just very
slightly touched on in my introduction around student decision-making and that, of
course, links into the whole agenda around transparency and admissions
transparency, which is a particular preoccupation for TEQSA and many of us in the
sector at the moment. So, let me pause and throw this open to the
floor and ask you if you have any questions. I see one immediately. A roving
mic will find you I think, just on the outside and then the next question is just
in front. The microphone will find you. Raph, just on the outside there, and if you
could say who you are and where you’re from, that would be helpful.>>ATTENDEE 1: Victoria University. Question
for Lin. Having done a very similar analysis myself
two years ago, with 5 years of data, comparing your results to mine raises a number
of very interesting issues. The first is the idea that you touched on, having only
university data, I could differentiate types of attrition, and so I did my analysis using
attrition in the form of lost to sector, and attrition in the form of lost to other universities
as two separate analyses. And looking at my results for that, it highlights one
of the issues that might arise in your analysis. In doing that, I find that I get variables
that act in opposite directions. The same variables act in opposite directions on
two different types of attrition. A perfect example is regional. Being regional tends
to have a bad effect in terms of loss to sector. But it tends to have a good effect
in terms of loss to other university for fairly obvious reasons. And so, by using the single
measure, you are actually perhaps getting variables coming up as not significant
because they have, in effect, opposite effects on the two types of attrition. So,
that�s the first issue that comes out of looking
at my analysis. The second is one that’s surprising. You say
that you looked at ATAR. I used as one of my variables the average ATAR of domestic
students admitted to each university. Commencing students. Using just that variable,
I can explain 75% of variation of lost to sector attrition. So I am stunned to hear
that you could not find any effect at all of
ATAR. Even on loss to other universities, I can get an R-squared of 55% just with that
variable. And so certainly to me, it appeared to be a very major factor, in fact, the
most significant factor in my analysis. So, I’m very interested to hear exactly what
measure you used for ATAR.>>LIN MARTIN: I think Pip in fact used, it
was the proportion of students admitted, the median, the median ATAR. I think the reason that some of these patterns
don't come up too is because there is that enormous diversity when you're looking
at 130 institutions with quite different profiles. And a lot of the private providers
only have, or the vast majority of their students would be, international students,
and so I'm not surprised. Sometimes we don't even get ATAR scores to look at, so
I'm not surprised. When you look at the
whole sector, it doesn't come up so finely defined as your study has. But again, as
that chart at the end showed, you do get patterns coming in where the same
variables are reflected in different ways, some positive and some negative. So,
I'm not surprised that you got results like you did. But I think you can't underestimate
the variety, and this is a multivariate analysis, not a univariate analysis. You're
just looking at one variable.>>ANTHONY MCCLARAN: Thanks Lin. I'm anxious
to get more questions in. There was one just in front I think.>>ATTENDEE 2: Thank you very much. Great
presentation, very exciting. A couple of interesting points. Failure rates and attrition
rates. Has there been any analysis with regards to subject failure rates and
attrition rates? Do the two correspond? Is there any relationship between the two? Secondly,
with regards to the VET students articulating into higher education, that’s
been very problematic, especially working in
the VET side because a lot of our students take the curriculum side. And I noticed
when I did the analysis, on a student level, that most of our students, who actually
had advanced standing, were the ones who had the greatest problems. Going back many, many years, prior to training
packages, where the courses at TAFE institutes were not really curriculum-based,
a lot of our students actually ended up going through to university quite successfully.
I'm just wondering if there's been any analysis done on the impact of training
packages being delivered at the AQF 6 and AQF 5 level and the advanced standing
and the changes in the way that those training packages have been implemented and
what effect does that have on the attrition rates?>>LIN MARTIN: Well I’m not sure that I can
answer a lot of that but I think the first point you talked about was, was there any
relationship between sort of passing within a year and so forth, and that’s an
interesting thing because we would expect, I
suppose just thinking from first principles, that if you had a high, what we call
progress rate, which is really a pass rate within a year, if you had higher pass rates
within the year, you wouldn't expect to have high attrition, and vice versa. And in
fact, one of the clusters that I put up in that
chart, and it's in the report, I think it was cluster
3, which was the medium sized, international business type area, was the only one
of the 4 clusters where the lower the progress rate, the within-year pass rate,
the higher the attrition.
So, that is the only thing we have on that. As for the detailed information on the
training packages and so forth, we haven’t done anything on that, though we do
keep in close contact with the VET providers because we have a number of dual
sector providers where those are issues of interest. This is the first of the occasional
series and that's about the only thing that I can say.>>ANTHONY MCCLARAN: Pip, is there anything
you want to say on the training packages?>>PIP PATTISON: No, I'm not really able
to say anything, but I do think the fact that you've asked those questions in this
setting is such a wonderful demonstration of the value of doing some analysis,
which invites the next set of questions, which invites the next set of analyses.>>ANTHONY MCCLARAN: Thank you very much.
I’d like to take some more questions please. Hands up. Do we have any
more? Yes, there's one there.>>ATTENDEE 3: College of Divinity. I see
on page 30 the comment in the report, that looking to the future, the strict definition
of attrition needs review because of the changing nature of various factors in the
sector, and we found it to be a very helpful report, helping us to position ourselves to
see where we are, so we are certainly very appreciative of this very thoughtful work
that's gone into this project. Thank you very much for being involved in that. But one of
the most obvious things that would seem to me would be right across the set, and not
just cluster 2, which has got some particular things in it. [1:04:00 Inaudible] 'nested
award' challenges. You can't win: if you enrol someone in your first-year Bachelor-level
diploma to see if they're all right, they don't become attrition; if you enrol them
in the Bachelor and they really can't make it,
they have to exit. And I would've thought the whole sector would be interested to see
perhaps something done about that.>>ANTHONY MCCLARAN: Okay thank you. I'm going
to direct that one to Nick. Nick, definitions of attrition and how they're
currently used and how they might be used in the future.>>NICK SAUNDERS: Well, it�s a very important
point that goes beyond, I think, the technical definitions for institutions that
run three trimesters and whether the students are completing in a calendar year
or not, which does cause problems in the calculations. But as we look to the future,
of course, there's going to be more and more students undertaking what is now being
called micro-credentialing, industry-based or authentic learning. They'll be
bundling those credentials up in various ways, they'll be dipping in and out of their education
as their workplace requires them to develop new knowledge and skills. And then,
how do you actually measure, in fact what is the meaning of attrition in that context,
I think, is a very important point. And then the second point you make about trying
to do the best for the students and being punished for it. I mean I think that's
also a very important point to keep in consideration. TEQSA's been brave enough
recently to say that a reasonable benchmark will be completion rates of 80%
in n plus two years. So, in a Bachelor degree, 80% completion rate in five years,
that would be a reasonable benchmark. That's like what the UK and the US are achieving.
However, having said that, why should we punish an institution that is deliberately
taking in less academically qualified students, according to the measures of, say,
a high school, and try and give them an opportunity to actually complete a higher
education degree. So there is no magic number. I think the
most important thing is for institutions to be
aware of their students' progress, of their students' attrition, making some effort
to measure what is happening and why it's happening,
and then putting in place appropriate support mechanisms to promote
the chances of the students succeeding. I mean, I think that’s a very
important point that's being made.>>ANTHONY MCCLARAN: It is. Pip, I think you'd
like to comment.>>PIP PATTISON: Yes, thank you. I just wanted
to thank you for the question. I just wanted to say that the more you think
about questions like this, the more we realize that our students are on quite complex
and diverse trajectories in education. They enter at different points, they dip in
and out in different ways, and I think to really understand what’s going on, we probably
need to be able to understand those trajectories in a much richer way, just as
we're starting to understand the richness of
the institutional setting, we need to understand the trajectories over time that our
students take.>>ANTHONY MCCLARAN: Thank you. Yes, question
on the aisle here.>>ATTENDEE 4: It will be loud enough anyhow.>>ANTHONY MCCLARAN: It helps with the recording
as well, thank you. Ok.>>ATTENDEE 4: Can I just continue the debate
about attrition because in your introductory remarks and also in the report,
it talks about attrition in terms of individual students, of wasted time and it
talks about financial burden. I guess the question you would ask is, is one year of
higher education better than none? We may all agree that three years is better than
one but is it equally true that one is better than none? In at least three ways.
The first way: for a student who only takes one year, over their lifetime, is their income
greater than that of someone who hasn't had higher education? Number two, we happen to
believe in the values given to people by education in terms of problem solving
and critical analysis, so they gain that in one year. And I guess we also believe
that education has got some public good aspect to it, so is it better for society
where at least part of the population has one year's education at a higher level than
nothing? So, attrition may be bad but it's not all bad.>>ANTHONY MCCLARAN: Thank you very much for
that question. Now that raises really important issues about recognition
of learning that has been accomplished, even when a full program has not been
completed. So, I'd like to ask the panel for comments on that. The terminology, and
I guess I ought to start with you Lin, as there was a quote from the report there. The
terminology of waste and burden is perhaps not giving us the right perspective
for looking at this issue.>>LIN MARTIN: Yes, I mean, point taken. But
I guess for the vast majority of the years that there has been interest in
attrition in the public domain, that sort of language has been used, but of course, I've
spent a lot of my life working in universities, in fact 8 in Australia in my long administrative
career. And these have been a range of universities: new ones, ones that are having
difficulty perhaps filling quotas, and so forth as well as the Group of Eight. And
I agree with you that in a lot of those institutions, I don't mean to ghettoise them,
but in a lot of those institutions, I've seen the benefits that come to students who probably
didn't ever think that they would come to university. And this was all
about universities; I apologize to the private providers for that. But I do think there
are great benefits. And my own view: I would really like us, in parallel, and if the
Minister's thinking about his performance fund,
to actually try and get a bit better fix on what students have actually learned. So to
try and understand that even if they leave early, what are the skills and the attributes
of the learning outcomes that they have been working towards in various subjects
and so forth in their course of study. And I wouldn't like to think that the only
performance issue was getting that attrition rate down, and I'd like to get some more
qualitative information as well as the quantitative to back up whether it is a loss, or
at least whether it's not as great a loss as it appears from the raw numbers.>>ANTHONY MCCLARAN: I'm just going to ask
Pip if she wants to comment about whether a student, who leaves Sydney after
a year, has wasted their time or money. That's not loaded in any way whatsoever.
Then I’m going to ask Nick for any final responses or reflections he has on the points
that have been made and then we will draw the formal part of the proceedings to
a close. You may be pleased to hear that we would love you to join us for a drink
and the informal discussion can continue. Drinks are, even as I speak, set
out at the back of the room, but Pip, first, then Nick and then we'll close the proceedings.
Thanks.>>PIP PATTISON: Well briefly, I'm going
to dodge that question, but I have seen some research that actually shows some benefits,
I think in terms of earnings in later life, from partial completion of tertiary study.
And I think it is fundamentally an empirical question. Of course, we also see
some cases of students who can be quite distressed by failure and for whom I don't
think it's a good experience. So, I think there are probably both positive and negative
impacts. I do agree with Lin that we as a sector do need to get a good handle on what
our students are learning and at Sydney, we've just commenced a major project
to actually assess learning outcomes across the institution using a common framework
as well as the normal assessments that students do, about their knowledge and
skills. So, I think it will be interesting to
see to what extent those approaches emerge over time.>>ANTHONY MCCLARAN: Thank you. Nick, the
report was not the end of this matter for TEQSA but very much a beginning,
so any reflections on what you've heard so far.>>NICK SAUNDERS: Yes, well I think what we've
heard today really shows the power of having data to examine and have a
conversation about: the different ways in which that data can be interrogated and
the different perspectives that can be brought to bear on the findings. The complexity
of the issue and the difficulties of taking a simplistic view of a matter as complex
as student success or attrition, and indeed then trying to translate that into any
sort of dollars and cents outcome. I think it's
really important that the sector has a broad discussion about this and there are
learnings to be made, both from the most successful universities through to small
private providers, and somehow or other, we’ve got to get that conversation going.
So, it's not all just about what's going on in universities. So Anthony, I'd like to
thank you for encouraging us to put on this first
occasional series. We will be having other occasional series forums in the months to
come. We’d be interested to hear from the sector the sorts of issues that you would
want to bring forward as well as us thinking up things to bring to you. Thanks to Lin and
Alonso for driving the project at TEQSA and the staff that assisted there with the
information and the analysis. Thanks to Pip for giving a very thoughtful and helpful response
to the findings. Thanks to Engineers Australia, particularly to the IT engineer,
who successfully remedied the difficulties that we were facing. Thanks to Karen Treloar
and the staff of TEQSA who have organized tonight's event. These things just don't
happen and we do appreciate the hard work of the staff in doing this. And thanks
to you all for coming to Melbourne and participating in this discussion, and we really
do encourage engagement between those who are providing higher education in
Australia and us, the regulator. Thanks very much to you all.>>ANTHONY MCCLARAN: Thanks Nick. [Applause]>>ANTHONY MCCLARAN: And again, thank you
all very much and let the discussion now continue informally over drinks
at the back of the room. Thank you.
