First, this question assumes that every university should have a "deep learning" person. Deep learning is mostly used in vision (and to a lesser extent NLP), and many universities don't have researchers in those areas, so they wouldn't have a deep learning researcher either.
One thing that people often forget is that academics have long careers (thanks to tenure, this is by design). So if you hire a bunch of researchers now who do deep learning, they're going to be around for decades. Academia tends to be conservative, so it's not going to stock up on deep learning researchers just because it's cool today. If this were the norm, CS departments would be full of fuzzy logic researchers hired in the 90s.
There's nothing magical about deep learning. It's one of many tools (including Bayesian methods, discriminative methods, etc.) that you should have in your toolbox. Departments try to hire bright people, not those who slavishly follow every fad. Obviously, there will be more faculty who do deep learning in the near future. (If Facebook, Google, and Baidu don't hire them all first, that is.)
That said, there are lots of folks working in this area. Of the schools mentioned in the question, there's Noah Smith at UW and Katrin Erk at Texas. Other places (off the top of my head) that work in this area: UMass, JHU, Maryland, NYU, Montreal, Michigan, and TTI. I'm more upset that Princeton and Caltech (where I did my PhD and undergrad) don't have professors in CS who do language research. That's the bigger crime in my opinion, and it's correlated with their lack of deep learning folks.
Blatant self-promotion ... Colorado has three folks working in this area: me, Mike Mozer, and Jim Martin.