
Wednesday, July 31, 2013

Some Recent, and Transparently Applicable, Results in Time-Series Econometrics


I think most of us would agree that when new techniques are introduced in econometrics, it's often a bit of a challenge to see exactly what would be involved in applying them. Someone comes up with a new estimator or test, and it's often a while before it gets incorporated into our favourite econometrics package, or until someone puts together an expository piece that illustrates, in simple terms, how to put the theory into practice.

In part, that's why applied econometrics "lags behind" econometric theory. Another reason is that a lot of practitioners aren't interested in reading the latest theoretical paper themselves.

Fair enough!

In any event, it's always refreshing when new inferential procedures are introduced into the literature in a way that exhibits a decent degree of "transparency" with respect to their actual application. For those of you who like to keep up with recent developments in time-series econometrics, here are some good examples of recent papers that (in my view) score well on the "transparency index":

Tuesday, July 30, 2013

Francis Diebold on GMM

On his blog, No Hesitations, Francis Diebold has two recent posts about GMM estimation that students of econometrics, and practitioners, definitely should read.

The first of these posts is here, and the second follow-up post is here.

Enjoy!

© 2013, David E. Giles

Monday, July 29, 2013

Recent, and Recommended.......

Recently, I griped (well, posted) about the need to get the economics back into papers that the authors characterize as "microeconometrics". Although I was venting (just a little!) about the "disconnect" that we so often see between the theory section and the empirical section in so many of the papers in this category, I also commented that there are plenty of papers out there that avoid this disconnect. I just wish there were more of them!

In response to one of the comments on that post, I gave just one such example, and afterwards I thought that although my choice was a good one, it was somewhat dated. So, on a more positive note, what about some recent papers that exemplify what I'm looking for, and what I'd like to see more of?

Wednesday, July 24, 2013

Information Criteria Unveiled

Most of you will have used, or at least encountered, various "information criteria" when estimating a regression model, an ARIMA model, or a VAR model. These criteria provide us with a way of comparing alternative model specifications, and selecting between them. 

They're not test statistics. Rather, they're minus twice the maximized value of the underlying log-likelihood function, adjusted by a "penalty factor" that depends on the number of parameters being estimated. The more parameters, the more complicated is the model, and the greater the penalty factor. For a given level of "fit", a more parsimonious model is rewarded more than a more complex model. Changing the exact form of the penalty factor gives rise to a different information criterion.
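To make that concrete, the two most familiar examples, Akaike's AIC and Schwarz's SIC (often called the BIC), take the following standard forms, where ln(L) is the maximized log-likelihood, k is the number of estimated parameters, and n is the sample size; the specification with the smallest criterion value is the one we prefer:

$$\mathrm{AIC} = -2\ln(L) + 2k\,, \qquad \mathrm{SIC} = -2\ln(L) + k\ln(n)\,.$$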

However, did you ever stop to ask "why are these called information criteria?" Did you realize that these criteria - which are, after all, statistics - have different properties when it comes to the probability that they will select the correct model specification? In this respect, they are typically biased, and some of them are even inconsistent.

This sounds like something that's worth knowing more about!

Monday, July 22, 2013

Former Students

It's always great to catch up with former students. I guess this is especially true of grad. students because (inevitably) you get to spend a lot of one-on-one time with them, and get to know them pretty well.

So, today I was thrilled to meet up with not one, but three, former grad. students! Peter Jacobsen, Cameron Woodbridge, and William Bi are all working for the B.C. Ministry of Forests, Lands and Natural Resource Operations, here in Victoria. I've known them all for quite some time. In fact, Peter was only the second M.A. student I supervised after I moved to UVic in 1994. 

Great lunch, guys!


© 2013, David E. Giles

Friday, July 19, 2013

Some Current Projects

I often get emails asking me what research projects I'm working on. Generally, I have several projects underway at any given time - usually at various stages of development or completion. In that respect I guess I'm pretty typical.

I also tend to have a mixture of theoretical and applied projects, some econometric and some essentially statistical in nature. I find that this provides some continuity in my work. It's not easy to focus on just one or two research projects all of the time, especially if they're not progressing as well as you'd like them to!

So, what am I up to right now? Here are some of the papers/projects that I'm working on:

Sunday, July 14, 2013

Vintage Years in Econometrics - The 1950's

Following on from my earlier posts about vintage years for econometrics in the 1930's and 1940's, here's my run-down on the 1950's.

As before, let me note that "in econometrics, what constitutes quality and importance is partly a matter of taste - just like wine! So, not all of you will agree with the choices I've made in the following compilation."

Thursday, July 11, 2013

Let's Put the "ECON" Back Into Microeconometrics

You just couldn't resist the title, could you?

Don't worry, I'm not going to be too harsh. After all, I'm rather fond of those who practise "applied microeconometrics" - especially lightly sautéed, with a little pepper and garlic. Sorry! Sorry!

The point that I want to make is a simple one, and I'll be brief.

How many seminars have you attended where the speaker has gone through the details of a formal microeconomic model, and then proceeded to a potentially interesting empirical application? And in how many cases was there a total "disconnect" between the theoretical model and the empirical model?

Hands up! Don't be shy! Wow - that's almost everyone!

Wednesday, July 10, 2013

Conference and Seminar Papers - From Both Sides of the Podium

Here are three somewhat related posts from Rob Hyndman, Professor of Statistics in the Department of Econometrics and Business Statistics at Monash University. Rob blogs at Hyndsight.

I liked them all, and I commend them to you:
  1. Giving a Research Seminar (2008)
  2. Attending Research Seminars (2009)
  3. Asking Good Questions (2013) - a guest post by Eran Raviv
For my own take on these matters, you might want to check here, here, and here.


© 2013, David E. Giles

Monday, July 8, 2013

Conference on Computing in Economics & Finance

The 19th. International Conference on Computing in Economics and Finance is coming up in Vancouver, B.C., this week. There's an interesting program.

One of my M.A. students, Yanan Li, and I have a paper that Yanan will be presenting at the conference. 

The paper is titled "Modeling Volatility Effects Between Emerging Asian Stock Markets and Developed Stock Markets".


© 2013, David E. Giles

Sunday, July 7, 2013

Happy 40th!

I'm not sure if this is a good day or a bad day! I was feeling fine, and then this reminder of my advancing age arrived in the mail.........


Has it really been that long?


© 2013, David E. Giles

Saturday, July 6, 2013

Musical Econometrics

There's a recent paper by McIntosh et al. (2013), titled "Listen to Your Data: Econometric Model Specification Through Sonification" that I really enjoyed. It's thought-provoking and innovative, and it takes things a lot further than what I mentioned in a previous post on data sonification.

Here's the abstract:

Friday, July 5, 2013

Allocation Models With Bounded Dependent Variables

My post yesterday, on Allocation Models, drew a comment to the effect that in such models the dependent variables take values that must be non-negative fractions. Well, as I responded, that's true sometimes (e.g., in the case of market shares), but not in other cases - such as the Engel curve example that I mentioned in the post.

The anonymous comment was rather terse, but I'm presuming that the point that was intended is that if the y variables have to be positive fractions, we wouldn't want to use OLS. Ideally, that's so. Of course, we could use OLS and then check that all of the within-sample predicted values are between zero and one. Better still, we could use a more suitable estimator - one that takes the restriction on the data values into account.

The obvious solution is to assume that the errors, and hence the y values, follow a Beta distribution, and then estimate the equations by MLE. As I noted in my response to the comment, the "adding up" restrictions that are needed on the parameters will be satisfied automatically, just as they are under OLS estimation.

Here's a demonstration of this.
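In the meantime, here's a minimal sketch in Python of the single-equation version of that idea - a Beta regression, with the mean linked to the regressors through a logit link, estimated by maximum likelihood. The data and variable names are purely illustrative, and this isn't the full allocation system with its "adding up" restrictions.

```python
# A minimal, illustrative sketch of single-equation Beta regression by MLE.
# The data are simulated; in an allocation model there would be a system of
# such equations, with cross-equation "adding up" restrictions.
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit, gammaln

rng = np.random.default_rng(42)
n = 200
x = rng.normal(size=n)
X = np.column_stack([np.ones(n), x])             # intercept + one regressor

# Simulate a fractional dependent variable from a Beta distribution
beta_true, phi_true = np.array([0.5, 1.0]), 20.0
mu = expit(X @ beta_true)                        # mean in (0, 1) via logit link
y = rng.beta(mu * phi_true, (1.0 - mu) * phi_true)

def neg_loglik(params):
    b, log_phi = params[:-1], params[-1]
    phi = np.exp(log_phi)                        # keep the precision positive
    m = expit(X @ b)
    a, c = m * phi, (1.0 - m) * phi
    ll = (gammaln(phi) - gammaln(a) - gammaln(c)
          + (a - 1.0) * np.log(y) + (c - 1.0) * np.log(1.0 - y))
    return -ll.sum()

res = minimize(neg_loglik, x0=np.zeros(X.shape[1] + 1), method="BFGS")
print("MLE of the coefficients:", res.x[:-1], " precision:", np.exp(res.x[-1]))
```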

Paper With Jacob Schwartz

It was nice to get the final "acceptance" yesterday for a paper co-authored with former grad. student, Jacob Schwartz.

The paper, titled "Bias-Reduced Maximum Likelihood Estimation of the Zero-Inflated Poisson Distribution", and with Jacob as lead author, will appear in Communications in Statistics - Theory & Methods. You can download a copy of the paper from here.

Jacob has been in the Ph.D. program at UBC for a while now. It seems quieter around the computing lab. without him!


© 2013, David E. Giles

Thursday, July 4, 2013

Allocation Models

An "allocation model" is a special type of multi-equation model that has some interesting properties. This type of model arises quite frequently in applied econometrics, and it's worth knowing about it. In this post I'll explain what an allocation model is, and explore some of the estimation results that arise.

Wednesday, July 3, 2013

Ms DOS

They say that, with children, it doesn't get "easier", it just gets "different". Well, I'm not so sure. I have four grown-up "children" - they're all successfully following their dreams, and I couldn't be happier!

So, I hope that Emma doesn't mind if I share this story from the mid 1990's, when her age was in single digits.

The Adjusted R-Squared, Again

In an earlier post about the adjusted coefficient of determination, R_A^2, I mentioned the following results that a lot of students don't seem to be aware of, in the context of a linear regression model estimated by OLS:

  1. Adding a regressor will increase (decrease) R_A^2 depending on whether the absolute value of the t-statistic associated with that regressor is greater (less) than one in value. R_A^2 is unchanged if that absolute t-statistic is exactly equal to one. If you drop a regressor from the model, the converse of the above result applies.
  2. Adding a group of regressors to the model will increase (decrease) R_A^2 depending on whether the F-statistic for testing that their coefficients are all zero is greater (less) than one in value. R_A^2 is unchanged if that F-statistic is exactly equal to one. If you drop a group of regressors from the model, the converse of the above result applies.
The first of these results is (effectively) stated as Theorem 3.1 in Greene (2012), but the proof is left as an exercise.

In a comment on my previous post, I was asked if I could supply simple proofs of these results.
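Proofs aside, the first result is easy to check numerically. Here's a quick sketch in Python (using statsmodels, with artificial data); it simply adds one candidate regressor and compares the adjusted R^2 with and without it:

```python
# A quick numerical check (not a proof) of result 1, using artificial data:
# adding a candidate regressor moves the adjusted R-squared up or down
# according to whether the |t|-statistic on that regressor exceeds one.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(123)
n = 50
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)                      # candidate regressor to be added
y = 1.0 + 2.0 * x1 + rng.normal(scale=3.0, size=n)

base = sm.OLS(y, sm.add_constant(x1)).fit()
full = sm.OLS(y, sm.add_constant(np.column_stack([x1, x2]))).fit()

t_x2 = full.tvalues[-1]                      # t-statistic on the added regressor
print(f"|t| on added regressor : {abs(t_x2):.3f}")
print(f"adjusted R^2, without  : {base.rsquared_adj:.5f}")
print(f"adjusted R^2, with     : {full.rsquared_adj:.5f}")
```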


Connections Between Univariate Distributions

I've been enjoying Francis X. Diebold's blog, No Hesitations. The other day he had a nice post on statistical graphics, and I found myself nodding (affirmatively) as I read through it. I won't repeat his points here, save to say:
  • I, too, am a great fan of Edward Tufte. I have a couple of his books, and I used to use Minard's Napoleon chart in my introductory descriptive statistics courses.
  • I have a copy of the chart of univariate statistical distribution relationships (Leemis et al., 2008) on my office wall. I was delighted to learn, from Francis's blog, that an interactive version of this chart is available.
The interactive version is definitely worth taking a look at.


© 2013, David E. Giles

Tuesday, July 2, 2013

Summer Reading

The schools are out, and here in Canada we celebrated Canada Day yesterday. That means it's now summer! And summer means summer reading.

So, here are some suggestions for you:
  • Andreou, E., E. Ghysels, and A. Kourtellos, 2013. Should macroeconomic forecasters use daily financial data and how? Journal of Business and Economic Statistics, 31, 240-251.
  • Downey, A. B., 2013. Think Bayes: Bayesian Statistics Made Simple. Green Tea Press, Needham MA.
  • Espejo, M. R., M. D. Pineda, and S. Nadarajah, 2013. Optimal unbiased estimation of some population central moments. Metron, 71, 39-62.
  • Giacomini, R., D. M. Politis, and H. White, 2013. A warp-speed method for conducting Monte Carlo experiments involving bootstrap estimators. Econometric Theory, 29, 567-589.
  • Hayter, A. J., 2013. A new procedure for the Behrens-Fisher problem that guarantees confidence levels. Journal of Statistical Theory and Practice, 7, 515-536.
  • Ouysse, R., 2013. Forecasting using a large number of predictors: Bayesian model averaging versus principal components regression. Australian School of Business Working Paper 2013 ECON 04, University of New South Wales.
  • Pinkse, J., 2013. The ET interview: Herman Bierens. Econometric Theory, 29, 590-608.
  • Stigler, S. M., 2007. The epic story of maximum likelihood. Statistical Science, 22, 598-620.
  • Yu, P., 2013. Inconsistency of 2SLS estimators in threshold regression with endogeneity. Economics Letters, in press.

© 2013, David E. Giles

N.Z. Association of Economists Conference

Although it's still the afternoon of Tuesday 2 July here on the We(s)t Coast, it's already the morning of Wednesday 3 July in New Zealand. That being the case, the 54th Annual Conference of the New Zealand Association of Economists is just getting underway in Wellington. Although I'm not attending, I do have a soft spot for this conference, and I'll be participating next year.

The conference program includes a number of interesting looking empirical papers, and as usual there is a strong emphasis on economic policy analysis.

The other reason for my interest in this conference? The first conference paper I ever presented was at the 1972 NZAE Conference, held at Massey University in Palmerston North. I talked about "Consumption Expenditure in New Zealand". How time flies!


© 2013, David E. Giles

Monday, July 1, 2013

Congratulations, Graham Voss!

Congratulations to my departmental colleague, Graham Voss, whose promotion to full Professor takes effect today!

Graham describes his research interests as: "Applied macroeconomics with a focus on monetary and fiscal policies and exchange rates". He's a very accomplished empirical macroeconomist, with extensive experience at The Reserve Bank of Australia (their central bank) to complement his academic contributions.

Here's his webpage.



© 2013, David E. Giles

The Bootstrap - A Non-Technical Introduction

Computer-intensive methods have become essential to much of statistical analysis, and that includes econometrics. Think of Monte Carlo simulations, MCMC for Bayesian methods, maximum simulated likelihood, empirical likelihood methods, the jackknife, and (of course) the bootstrap.

Although we usually date the bootstrap from Bradley Efron's 1979 paper, as a resampling method it has its roots in earlier, related, contributions including those of Quenouille (1949, 1956).

The main purpose of this post is to draw readers' attention to the piece by Diaconis and Efron (1983) that appeared in Scientific American. It's written for a "general audience", which is nice, and it also provides an interesting snapshot of what was cutting-edge computing 30 years ago. The discussion paper version of the article (including typos) is available here.

As a final bonus, the examples include one from econometrics!
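For anyone who hasn't seen the mechanics before, here's a minimal sketch in Python of the basic idea: resample the data with replacement, recompute the statistic of interest each time, and use the spread of those replicates to gauge its sampling variability. The data below are artificial.

```python
# A minimal, illustrative bootstrap: estimate the standard error of a sample
# mean by resampling the (artificial) data with replacement.
import numpy as np

rng = np.random.default_rng(1979)              # a nod to Efron (1979)
data = rng.exponential(scale=2.0, size=30)     # small, skewed artificial sample

B = 2000                                       # number of bootstrap replicates
boot_means = np.empty(B)
for b in range(B):
    resample = rng.choice(data, size=data.size, replace=True)
    boot_means[b] = resample.mean()

print(f"sample mean            : {data.mean():.4f}")
print(f"bootstrap s.e. of mean : {boot_means.std(ddof=1):.4f}")
print(f"textbook s.e. of mean  : {data.std(ddof=1) / np.sqrt(data.size):.4f}")
```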


References

Diaconis, P. and B. Efron, 1983. Computer-intensive methods in statistics. Scientific American, 248, 116-132.

Efron, B., 1979. Bootstrap methods: Another look at the jackknife. Annals of Statistics, 7, 1-26.

Quenouille, M. H., 1949. Approximate tests of correlation in time series. Journal of the Royal Statistical Society, Series B, 11, 18-44.

Quenouille, M. H., 1956. Notes on bias in estimation. Biometrika, 43, 353-360.


© 2013, David E. Giles