Problem, research strategy, and findings
Planners need to read plans to learn from and adapt current practice, yet they may struggle to find time to read and study lengthy planning documents, especially in emerging areas such as climate change and urban resilience. Natural language processing (NLP) has recently shown promise in processing large volumes of text. We asked whether planners could use NLP techniques to extract useful and reliable information from planning documents more efficiently. Analyzing 78 resilience plans from the 100 Resilient Cities Network, we found that the results of topic modeling, an NLP technique, coincided to a large extent (80%) with those of a conventional content analysis. Topic modeling was generally effective and efficient at extracting the main information in plans, whereas content analysis uncovered more in-depth details but at the cost of considerable time and effort. We further propose a transferable model that planners can use to read and study large collections of plans more efficiently with machine learning. Our methodology has limitations: Both topic modeling and content analysis can be subject to human bias and generate unreliable results; NLP text-processing techniques may produce inaccurate results because of method-specific limitations; and the transferable approach can only be applied to big textual data with enough sufficiently long documents.
Takeaway for practice
NLP represents a valuable addition to the planner’s toolbox. Topic modeling, coupled with other NLP techniques, can help planners effectively discover key topics in plans, identify planning priorities and plans with a particular emphasis, and find relevant policies.
Scopus Subject Areas
- Geography, Planning and Development
- Urban Studies
Keywords
- machine learning
- natural language processing
- plan evaluation
- urban resilience