Newsroom

Q&A: Roland Stephen, Associate Director, Higher Education & Economic Development, Center for Science, Technology & Economic Development, SRI International


Roland Stephen, Associate Director, Higher Education & Economic Development, Center for Science, Technology & Economic Development, SRI International

Roland Stephen is the Associate Director for Higher Education & Economic Development in the Center for Science, Technology & Economic Development at SRI International. He is presently a co-leader on four studies for the Economic Development Administration, Department of Commerce. He has worked on performance assessment in higher education and on regional economic development strategies. Prior to joining SRI, Dr. Stephen had seven years of leadership experience at the Institute for Emerging Issues (IEI), an applied policy unit at North Carolina State University, where he was also an associate professor in the School of Public and International Affairs. He has directed policy research, strategic planning, program design and public engagement in the areas of technology policy, higher education policy, and regional economic development. Dr. Stephen received his Ph.D. in International and Comparative Political Economy from UCLA and his B.A. in Economics and History from Cambridge University, United Kingdom.

Q1. SRI worked on an EDA University Center Best Practices Report. Can you talk a little bit about the goals of the report? How did data factor in? What data did you collect? What were the methods?

The overarching goal of the report was to review and inventory the various attributes and activities of EDA’s existing university centers (UCs) and identify the best practices that emerged from our review. We quickly found out that the 58 UCs were very, very different along every dimension – goals, activities, duration, funding, location, etc. Data, of course, would underlie our analysis, but because of the wide variation among Centers, it would be impossible for us to systematically collect the same quantitative information and rigorously determine a “best practice.” Therefore, we approached the evaluation using mixed methods – collecting quantitative information where we could, but relying on a qualitatively rich understanding of each UC to develop some “best practices.”

While EDA tracked the progress and deliverables of each individual UC, at the time there was no centralized collection of information on UC activities, so much of our effort was spent collecting data on the UCs. The data ranged from institutional characteristics, such as where in the university a UC is situated, to details about activities and clients, and, of course, to the outcomes of the centers and what the centers themselves considered to be their best practices.

Because these programs are complex and nuanced, we gathered data in several different ways. First, we surveyed the UCs for basic information; then we visited all but one of the EDA regional offices to talk with staff and to obtain applications and reports. (We interviewed the Seattle office over the phone.) Based on the survey and input from regional staff, we visited eight UCs spread across the country. We talked with UC directors and staff and interviewed clients and stakeholders. Based on these visits we developed an interview protocol, and we then interviewed every UC by phone. Finally, we used a web survey to collect client and stakeholder feedback.

Q2. Building capacity through data analysis and GIS tools is one of the topics in the evaluation. Can you talk a little bit about the relationship between capacity building and data analysis?

Many public officials and economic developers do not have the capacity – in terms of skill sets, tools, or access to data – to perform the economic analysis and strategic planning needed to understand and shape the economic development of their region. Economic development planning hinges on data – for example, data on population, workforce, industrial clusters, consumption patterns, residential profiles, etc. Many of these data are not available at the local level and are difficult for small communities and organizations to analyze. The University Center grant allows university centers to provide capacity to regions in support of community and regional economic development planning efforts. For example, many university centers have access to geographic information systems (GIS), which provide a geographic representation of data of interest. These maps enable economic developers to present economic data in a geographic space to support their activities. Alternatively (and sometimes in a complementary way), university centers provide access to data and training in analysis so economic development practitioners can do their own analysis.
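For illustration, the short Python sketch below shows the kind of GIS output described above – a county-level choropleth built with the geopandas library. The file names and column names (counties.shp, county_indicators.csv, county_fips, unemployment_rate) are hypothetical placeholders, not data from any particular University Center.

    # Illustrative sketch only: the file paths and column names below are
    # hypothetical placeholders, not data from any particular University Center.
    import geopandas as gpd
    import pandas as pd
    import matplotlib.pyplot as plt

    # Load county boundaries (any polygon shapefile with a FIPS code column).
    counties = gpd.read_file("counties.shp")

    # Load a table of county-level economic indicators.
    indicators = pd.read_csv("county_indicators.csv", dtype={"county_fips": str})

    # Join the indicator table to the geography on the shared FIPS code.
    merged = counties.merge(indicators, on="county_fips", how="left")

    # Draw a simple choropleth: darker counties have higher unemployment.
    ax = merged.plot(column="unemployment_rate", cmap="OrRd", legend=True,
                     missing_kwds={"color": "lightgrey"})
    ax.set_title("Unemployment rate by county (illustrative)")
    ax.set_axis_off()
    plt.savefig("unemployment_map.png", dpi=150)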

Q3. How else are University Centers using data with regard to economic development? Are there any practices that might translate well to other economic development organizations or entities looking to do economic development?

Many Centers analyze the usual set of economic and workforce indicators needed in strategic planning; however, a few centers are using data in other interesting ways. The University Center at Kansas State University, hosted by the Advanced Manufacturing Institute (AMI), established the Kansas Opportunity Innovation Network (KOIN). Its mission is to enhance the global competitiveness of rural businesses by providing access to innovative ideas, new markets, expertise, capital, and collaborations, independent of geographical proximity. KOIN has developed new regional innovation tools and uses this knowledge to support local and regional businesses through in-depth market analyses that complement the large-scale new product development services for which AMI is widely known. These complementary services allow AMI/KOIN to enhance the global competitiveness of rural and distressed companies and regions in Kansas.

In support of its mission, KOIN’s strategy includes profiling the innovation competencies, assets, capabilities, and needs of regions, communities, and local companies in order to scout new opportunities – especially global opportunities – outside existing markets where clients may have few or no connections. KOIN also maps networks of technology providers, expertise, capital, and potential business partners with complementary competencies, so that Center clients can connect and combine opportunities, companies, communities, and regions in innovative ways and respond competitively. KOIN’s work has included a quantitative key-industry analysis and a quantitative regional innovation assessment for Kansas counties. One study mapped where workers reside versus where they are employed to illustrate regional interconnectedness. Another study produced a data visualization of industry concentration, including location, number of firms, employment, and sector.
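Industry-concentration analyses of the kind described above are commonly built on the location quotient – a region’s share of employment in an industry divided by the national share, where a value above 1 indicates a local concentration. The minimal pandas sketch below illustrates that standard calculation with invented regions and figures; it is not KOIN’s actual analysis.

    # Minimal sketch of a location-quotient calculation, a common measure of
    # industry concentration. The regions, industries, and employment figures
    # are invented for illustration.
    import pandas as pd

    emp = pd.DataFrame({
        "region":     ["Region A", "Region A", "Region B", "Region B"],
        "industry":   ["Manufacturing", "Services", "Manufacturing", "Services"],
        "employment": [4000, 6000, 1000, 9000],
    })

    # Industry share of employment within each region.
    regional_share = emp["employment"] / emp.groupby("region")["employment"].transform("sum")

    # Industry share of employment across all regions combined (the "national" share).
    national_share = emp.groupby("industry")["employment"].transform("sum") / emp["employment"].sum()

    # A location quotient above 1 means the industry is more concentrated in
    # the region than it is nationally.
    emp["location_quotient"] = regional_share / national_share

    print(emp)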

Another kind of data is internal data – the tracking of a center’s own activities, which varies among university centers. Longitudinal tracking of clients and engagements not only helps a center report on its activity, it can also keep track of network participation. For example, the Center for Economic Development at California State University, Chico and California State University, Fresno uses off-the-shelf Customer Relationship Management (CRM) software to log every client interaction. All Center employees have been trained to record the details of each interaction and the work product provided to the client. Analyzing client patterns has allowed the Center to predict when certain clients are likely to request specific kinds of data. It also provides an audit trail that is useful for problem solving.
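As a rough illustration of the idea behind such an interaction log, the pandas sketch below tallies logged requests by client, request type, and calendar month to surface recurring patterns. The clients, dates, and request types are invented; the Center itself uses commercial CRM software, not a script like this.

    # Toy sketch of an interaction log and a recurring-request summary.
    # Clients, dates, and request types are invented for illustration.
    import pandas as pd

    log = pd.DataFrame({
        "client":  ["City of X", "City of X", "County Y", "City of X", "County Y"],
        "date":    pd.to_datetime(["2013-01-15", "2014-01-20", "2013-06-02",
                                   "2015-01-18", "2014-06-11"]),
        "request": ["labor data", "labor data", "GIS map",
                    "labor data", "GIS map"],
    })

    # Count requests by client, request type, and calendar month to surface
    # seasonal patterns (e.g., a client asking for labor data every January).
    summary = (log
               .assign(month=log["date"].dt.month)
               .groupby(["client", "request", "month"])
               .size()
               .rename("count")
               .reset_index())

    print(summary.sort_values("count", ascending=False))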

Both of these practices could be translated to other economic development practitioners. It is important to know one’s region – getting out and about to know one’s area matters, but so does looking at the data to find competencies and networks you may not otherwise see. Regional and local data collected by third parties, and the microdata collected by your own staff, may reveal a larger picture.