blog.xml
<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:atom="http://www.w3.org/2005/Atom"
xmlns:media="http://search.yahoo.com/mrss/"
xmlns:content="http://purl.org/rss/1.0/modules/content/"
xmlns:dc="http://purl.org/dc/elements/1.1/"
version="2.0">
<channel>
<title>XP's Blog</title>
<link>https://xpsong.com/blog.html</link>
<atom:link href="https://xpsong.com/blog.xml" rel="self" type="application/rss+xml"/>
<description>Musings about data science, design, music and more...</description>
<image>
<url>https://xpsong.com/images/sheep.jpg</url>
<title>XP's Blog</title>
<link>https://xpsong.com/blog.html</link>
</image>
<generator>quarto-1.5.57</generator>
<lastBuildDate>Tue, 31 Oct 2023 16:00:00 GMT</lastBuildDate>
<item>
<title>New Online Course: Geospatial Data Science with R</title>
<dc:creator>XP Song</dc:creator>
<link>https://xpsong.com/posts/geospatial-datascience-r/</link>
<description><![CDATA[
<p>I’m very excited to announce that I have released a new online course ‘<a href="https://bit.ly/geospatial-datascience-r_udemy"><strong>Geospatial Data Science with R</strong></a>’ on Udemy!</p>
<p>As part of the opening offer, the <strong>first 100 sign-ups</strong> will have <strong>FREE lifetime access</strong> at the following link. Claim yours today: <a href="https://bit.ly/new-geospatial-course">bit.ly/new-geospatial-course</a></p>
<p>Check out the video trailer and see if the course is right for you:</p>
<div style="position: relative; padding-bottom: 56.25%; height: 0; overflow: hidden;">
<iframe src="https://www.youtube.com/embed/playlist?list=PLn_S1vjm5cr29NH4BwcgG-K31_VG6dgtm" style="position: absolute; top: 0; left: 0; width: 100%; height: 100%; border:0;" allowfullscreen="" title="Geospatial Data Science with R">
</iframe>
</div>
<p><br></p>
<p>What sets this course apart from typical data science offerings is its focus on spatial problems, which offer a visually rich landscape for exploration and analysis. Whether you’re an absolute beginner or a seasoned professional, the course is designed to ground your understanding and equip you with practical skills that you can put into action immediately.</p>
<p>Finally, feel free to share this post with others who may benefit from the course. If you’ve signed up and have any feedback to offer, leaving a rating and review will also help me out a lot!</p>
<p>Happy learning!<br>
Xiao Ping</p>
<p><br></p>
<hr>
<p><br></p>
<center>
<section id="more-information" class="level3">
<h3 class="anchored" data-anchor-id="more-information">More information</h3>
<p><br></p>
<p><img src="https://xpsong.com/posts/geospatial-datascience-r/assets/2.jpg" width="80%"></p>
<p><img src="https://xpsong.com/posts/geospatial-datascience-r/assets/3.jpg" width="80%"></p>
<p><img src="https://xpsong.com/posts/geospatial-datascience-r/assets/4.jpg" width="80%"></p>
<p><img src="https://xpsong.com/posts/geospatial-datascience-r/assets/5.jpg" width="80%"></p>
<p><img src="https://xpsong.com/posts/geospatial-datascience-r/assets/6.jpg" width="80%"></p>
<p><img src="https://xpsong.com/posts/geospatial-datascience-r/assets/7.jpg" width="80%"></p>
<p><br></p>
</section></center>
<p><br></p>
<section id="links" class="level2">
<h2 class="anchored" data-anchor-id="links">Links</h2>
<p>➡ Go to course <a href="https://bit.ly/geospatial-datascience-r_udemy">here</a><br>
➡ Extended Content Preview of Week 1: <a href="https://bit.ly/geospatial-datascience-r_youtube-preview">Introduction to R Programming</a> (YouTube)<br>
➡ More information and updates on latest offers: <a href="https://xpsong.com/courses">xpsong.com/courses</a></p>
<p><br></p>
<p><br></p>
<p>This post is also shared on <a href="https://r-bloggers.com">R-bloggers.com</a>.</p>
</section>
]]></description>
<category>Tutorials</category>
<category>Software</category>
<category>R</category>
<guid>https://xpsong.com/posts/geospatial-datascience-r/</guid>
<pubDate>Tue, 31 Oct 2023 16:00:00 GMT</pubDate>
<media:content url="https://xpsong.com/posts/geospatial-datascience-r/featured.jpg" medium="image" type="image/jpeg"/>
</item>
<item>
<title>Biodiversity in cities: How can we assess the ‘performance’ of urban developments?</title>
<dc:creator>XP Song</dc:creator>
<link>https://xpsong.com/posts/biodivercity/</link>
<description><![CDATA[
<p><br></p>
<section id="current-industry-practices" class="level3">
<h3 class="anchored" data-anchor-id="current-industry-practices">Current industry practices</h3>
<p><a href="https://www.ura.gov.sg/Corporate/Planning/Our-Planning-Process/Bringing-plans-to-Reality/Environmental-Impact-Assessment">Environmental impact assessments</a> are commonly used to assess the impact of new urban developments on biodiversity, and serve to protect natural areas through legislation. In cities such as Singapore, independent consultancies often conduct the assessments and produce reports that are used as points of reference when engaging stakeholders (e.g., government agencies, nature groups) and the general public.</p>
<p>As a part of these assessments, on-site surveys of plants and animals are conducted within the area slated for development. These opportunistic records can provide a general picture of biodiversity and potential ways to mitigate environmental impact. However, the recommendations provided tend to be qualitative and too coarse for planning at fine spatial scales (see examples in Fig. 1). In current practice, there is no way to quantitatively compare proposed design scenarios, or to compare present and future ‘performance’.</p>
<center>
<div class="cell" data-layout-align="center">
<div class="cell-output-display">
<div class="quarto-figure quarto-figure-center">
<figure class="figure">
<p><img src="https://xpsong.com/posts/biodivercity/assets/aecom-2022.png" class="img-fluid figure-img" style="width:100.0%"></p>
<figcaption>Figure 1: Recommendations to mitigate the impact of urban development on biodiversity in a consultancy report (Source: <a href="https://www.hdb.gov.sg/cs/infoweb/-/media/doc/RPG/Keppel-Club-Site-EIS-Report.ashx">AECOM, 2022</a>).</figcaption>
</figure>
</div>
</div>
</div>
</center>
<p><br></p>
<hr>
</section>
<section id="a-predictive-approach-is-needed" class="level3">
<h3 class="anchored" data-anchor-id="a-predictive-approach-is-needed">A predictive approach is needed</h3>
<p>Predictive spatial modelling can address the shortcomings of opportunistic sampling, and is able to assess biodiversity at the spatial scales required to inform planning decisions (0.1–100 hectares). For example, <a href="https://besjournals.onlinelibrary.wiley.com/doi/full/10.1111/1365-2664.13782">species distribution modelling (SDM)</a><sup>1</sup> is a technique that predicts the distribution of a species across geographic space and time, based on environmental conditions such as climate and the physical landscape (Fig. 2). Such SDM frameworks could bring far greater rigour and spatial precision if integrated into the urban development process.</p>
<center>
<div class="cell" data-layout-align="center">
<div class="cell-output-display">
<div class="quarto-figure quarto-figure-center">
<figure class="figure">
<p><img src="https://xpsong.com/posts/biodivercity/assets/sdm.png" class="img-fluid figure-img" style="width:100.0%"></p>
<figcaption>Figure 2: How species distribution modelling (SDM) generally works (Source: <a href="https://damariszurell.github.io/SDM-Intro/">Damaris Zurell, 2020</a>).</figcaption>
</figure>
</div>
</div>
</div>
</center>
<p><br></p>
<p>One limitation of SDMs, however, is their focus on a single species of interest. Information about the number of species (<em>species richness</em>) observed at a location is not considered. For instance, natural forests may support many more species (high <em>species richness</em>), each in greater numbers (high <em>species abundances</em>), compared to urban areas (see Fig. 3). However, SDMs would only convey the probability of occurrence for a chosen species (e.g., habitat suitability map in Fig. 2). While SDMs can be applied to a <a href="https://link.springer.com/chapter/10.1007/978-3-642-58001-7_11">‘keystone’ species</a><sup>2</sup>, there are challenges in identifying such species<sup>3</sup> and in ensuring that the chosen species accurately represents the ‘total biodiversity’ of an urban area.</p>
<center>
<div class="cell" data-layout-align="center">
<div class="cell-output-display">
<div class="quarto-figure quarto-figure-center">
<figure class="figure">
<p><img src="https://xpsong.com/posts/biodivercity/assets/species-richness-vs-abundance.png" class="img-fluid figure-img" style="width:70.0%"></p>
<figcaption>Figure 3: Illustration showing the difference between <em>species richness</em> and <em>species abundance</em>.</figcaption>
</figure>
</div>
</div>
</div>
</center>
<p><br></p>
<p><em>Community structure</em> is another aspect of biodiversity that SDMs do not represent well, and it remains an active area of research. While it is possible to combine <a href="https://besjournals.onlinelibrary.wiley.com/doi/full/10.1111/2041-210X.12841">multiple SDMs</a><sup>4</sup> to produce a community-level model, the diversity of species communities between different areas (<em>Beta</em> diversity; see Fig. 4) is not represented in such models. For example, two urban regions may both have the same total number of species (<em>Gamma</em> diversity; see Fig. 4), but the presence of large water bodies in one may result in diverse communities of water-loving species that cannot be found elsewhere. SDMs are not able to highlight the presence of such distinct communities (e.g., <em>Beta</em> diversity map in Fig. 5), which is crucial when prioritising areas for conservation and urban development.</p>
<center>
<div class="cell" data-layout-align="center">
<div class="cell-output-display">
<div class="quarto-figure quarto-figure-center">
<figure class="figure">
<p><img src="https://xpsong.com/posts/biodivercity/assets/alpha-beta-gamma.png" class="img-fluid figure-img" style="width:70.0%"></p>
<figcaption>Figure 4: <em>Gamma</em> (total) diversity across a larger region is composed of <em>Alpha</em> (local) diversity or species richness at single sites, as well as the <em>Beta</em> (community) diversity representing the differences in species compositions between sites.</figcaption>
</figure>
</div>
</div>
</div>
</center>
<p><br></p>
<hr>
</section>
<section id="filling-the-gap-in-current-methods" class="level3">
<h3 class="anchored" data-anchor-id="filling-the-gap-in-current-methods">Filling the gap in current methods</h3>
<p>As part of a research project, our team has been developing methods to incorporate these missing elements (e.g., <em>Alpha</em>, <em>Beta</em>, and <em>Gamma</em> diversity) into predictive spatial modelling of biodiversity. We’ve also been working on an R package <a href="https://ecological-cities.github.io/biodivercity"><code>biodivercity</code></a> which will allow users to develop and apply such models for their own use cases, and validate model results based on data that they collect. Our method assesses the habitat suitability of landscapes based on their physical characteristics (e.g. <a href="../../posts/Intro2R-spatial">spatial patterns of land cover</a> from <a href="../../posts/city-landcover-change">satellite imagery</a> & LiDAR data, built elements from OpenStreetMap). However, instead of examining the effect of landscapes on individual species, we examine their effect on four major animal groups (birds, butterflies, odonates and amphibians) found in Singapore (Fig. 5).</p>
<center>
<div class="cell" data-layout-align="center">
<div class="cell-output-display">
<div class="quarto-figure quarto-figure-center">
<figure class="figure">
<p><img src="https://xpsong.com/posts/biodivercity/assets/framework.png" class="img-fluid figure-img" style="width:80.0%"></p>
<figcaption>Figure 5: Broad overview of the data workflow for a chosen animal group (e.g., birds). The <em>Alpha</em> diversity heat map represents the number of species, while the <em>Beta</em> diversity colour map represents unique species communities within a given area.</figcaption>
</figure>
</div>
</div>
</div>
</center>
<p><br></p>
<p>If models are built using remotely sensed landscape data, we can easily predict the diversity of each animal group across time and geographical space (Fig. 5). For example, the interactive map below shows the <em>Alpha</em> (local) diversity of odonates predicted across all <a href="https://data.gov.sg/dataset/master-plan-2019-subzone-boundary-no-sea">subzones</a><sup>5</sup> in Singapore during the year 2020, at a coarse pixel resolution of 100 hectares. Depending on the level of detail required, the pixel resolution can be adjusted accordingly (e.g., 0.1 hectares).</p>
<iframe seamless="" src="assets/plot_leaflet.html" width="100%" height="500">
</iframe>
<p><br></p>
<p>Pixel values can subsequently be summarised within the zones used in city planning, to allow comparisons between these planning units (Fig. 5). The resulting distribution of summarised values can then be used to compare the ‘performance’ of each planning unit relative to others in the city, or to set a benchmark/target for the desired level of ‘performance’ (Fig. 6). For example, subzones in Singapore could be benchmarked against the mean of the distribution, as shown below:</p>
<center>
<div class="cell" data-layout-align="center">
<div class="cell-output-display">
<div class="quarto-figure quarto-figure-center">
<figure class="figure">
<p><img src="https://xpsong.com/posts/biodivercity/assets/subzone-distribution-odonates.png" class="img-fluid figure-img" style="width:60.0%"></p>
<figcaption>Figure 6: Histogram showing the distribution of values for the average number of odonate species (<em>Alpha</em> diversity) per pixel within each of the 332 subzones in Singapore. Subzones were assigned an arbitrary score of -2 to 2 based on standard deviations from the mean (i.e., performance of the ‘average’ subzone).</figcaption>
</figure>
</div>
</div>
</div>
</center>
<p><br></p>
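<p>The benchmarking step above can be sketched in base R. This is a minimal illustration with hypothetical subzone values (the real analysis uses per-pixel predictions summarised within each of the 332 subzones):</p>
```r
# Hypothetical mean Alpha diversity per pixel for five subzones
subzone_means <- c(a = 3.1, b = 4.8, c = 2.2, d = 5.9, e = 4.0)

# Standardise against the city-wide mean (standard deviations from the mean)
z <- (subzone_means - mean(subzone_means)) / sd(subzone_means)

# Assign an integer score from -2 to 2, as in Fig. 6
score <- pmax(-2, pmin(2, round(z)))
```
<p>Subzones near the ‘average’ receive a score of 0, while unusually species-poor or species-rich subzones are capped at -2 and 2 respectively.</p>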
<p>If spatial predictions were made for multiple snapshots in time, benchmarking could be based on whether the average pixel value for a particular planning unit increases or decreases between two time periods (Fig. 5). For example, if ‘no net loss’ in biodiversity is set as a target, a negative score could be assigned if the average pixel value is reduced, while a positive score could be assigned if the average pixel value increases.</p>
<p>Finally, it is worth noting that our method allows full customisation of both the pixel size and boundaries within which to summarise the pixel values. This provides flexibility according to the level of analysis (e.g., geographical scale) required by the user. By summarising pixel values within zones used in city planning, animal diversity may be assessed alongside other indices also summarised at the level of these planning units, thus providing a more comprehensive view of components related to biodiversity and beyond (Fig. 5).</p>
<p><br></p>
<hr>
</section>
<section id="biodiversity-in-the-future" class="level3">
<h3 class="anchored" data-anchor-id="biodiversity-in-the-future">Biodiversity in the future</h3>
<p>Landscape data from remotely sensed sources allow biodiversity to be monitored in the past and present. However, there is also a need to assess future urban developments, for instance, to see if proposed designs can effectively mitigate the loss of biodiversity. But since such landscapes do not yet exist, snapshots of remotely sensed data cannot be used. It is therefore important to carefully consider data compatibility between these different use cases when building and using the predictive models.</p>
<p>Urban design and planning involve the consideration of multiple design scenarios. Manually generated landscape elements (e.g., vector data for vegetation and water) may be produced from prospective designs, but the format and types of such data must be compatible with those used in the predictive models. For instance, when selecting landscape predictors to build the models, land cover classification as discrete rasters would be more compatible with manually generated data than continuous rasters (e.g., spectral indices such as NDVI) that cannot feasibly be derived from design data. Vegetation generated in design scenarios can be rasterised into discrete land cover types (Fig. 7), and used to replace the remotely sensed data within regions of interest (Fig. 8). Such amendments to landscape data can be made across a site slated for urban development, and then used to make spatial predictions for that particular design scenario (see maps in Fig. 5).</p>
<center>
<div class="cell" data-layout-align="center">
<div class="cell-output-display">
<div class="quarto-figure quarto-figure-center">
<figure class="figure">
<p><img src="https://xpsong.com/posts/biodivercity/assets/data-compatibility-rasterize.png" class="img-fluid figure-img" style="width:100.0%"></p>
<figcaption>Figure 7: Example showing how manually generated vector data (points, polygons) of vegetation can be converted to a classified raster of vegetation types used in the predictive models.</figcaption>
</figure>
</div>
</div>
</div>
</center>
<p><br></p>
<center>
<div class="cell" data-layout-align="center">
<div class="cell-output-display">
<div class="quarto-figure quarto-figure-center">
<figure class="figure">
<p><img src="https://xpsong.com/posts/biodivercity/assets/data-compatibility-amend.png" class="img-fluid figure-img" style="width:100.0%"></p>
<figcaption>Figure 8: Example showing how a classified raster of remotely sensed vegetation can be amended with the manually generated vector data (rasterised). The dark green areas represent canopy cover, while the light green areas represent short vegetation.</figcaption>
</figure>
</div>
</div>
</div>
</center>
<p><br></p>
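<p>The rasterise-and-amend workflow in Figs. 7–8 can be sketched with the <code>terra</code> package. The file names and the <code>class</code> attribute here are hypothetical placeholders, not part of the <code>biodivercity</code> API:</p>
```r
library(terra)

# Hypothetical design-scenario vegetation polygons, with a 'class'
# column holding discrete land cover types (e.g. canopy, short vegetation)
veg <- vect("design_vegetation.gpkg")

# Remotely sensed classified land cover raster used by the models
landcover <- rast("landcover_2020.tif")

# Rasterise the vector data onto the same grid as the land cover raster (Fig. 7)
veg_rast <- rasterize(veg, landcover, field = "class")

# Cells covered by the design scenario override the remotely sensed
# classes; everywhere else the original raster is retained (Fig. 8)
amended <- cover(veg_rast, landcover)
```
<p><code>cover()</code> fills the <code>NA</code> cells of the scenario raster with values from the original land cover, which is exactly the ‘amendment within a region of interest’ described above.</p>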
<p>While such data conversions may allow similar predictors (and hence models) to be used for different use cases, it should be noted that potential mismatches between different data sources may result in inaccurate predictions. For instance, the level of detail in design scenarios may not include the exact locations of planted trees, and their estimated canopy projection areas may vary greatly from reality after planting. Furthermore, the remotely sensed data represents a top-down view of the landscape, and the effect of multi-tiered planting is not accounted for within the landscape predictors. Collaboration between researchers and practitioners is needed to ensure that model workflows align with the data formats and outputs used in design practice, and that suitable methods are used to ensure that artificially generated datasets are both compatible and accurate to reality after implementation.</p>
<p><br></p>
<hr>
</section>
<section id="summing-it-up" class="level3">
<h3 class="anchored" data-anchor-id="summing-it-up">Summing it up</h3>
<p>Environmental impact assessments help keep governments and companies accountable as they embark on urban development projects. However, current methods do not provide the level of precision required to make quantitative comparisons between places and across time, especially into the future. We also need a much higher level of nuance when prioritising areas for development/conservation, in terms of the number (<em>Alpha</em> diversity) and communities (<em>Beta</em> diversity) of species spatially distributed across the landscape. We will soon be releasing the first version of our R package <a href="https://ecological-cities.github.io/biodivercity"><code>biodivercity</code></a>, which we hope will contribute to the larger toolbox of methods used to assess biodiversity in cities. Stay tuned!</p>
<p><br></p>
<p><br></p>
<hr>
</section>
<section id="acknowledgements" class="level3">
<h3 class="anchored" data-anchor-id="acknowledgements">Acknowledgements</h3>
<p>This blog post showcases some of the work undertaken in a research project to develop a biodiversity index for residential towns. It was funded from 2016–2022 under the Singapore Ministry of National Development Research Fund, awarded to the National University of Singapore in partnership with the Singapore Housing & Development Board. The lead investigator is Dr. Chong Kwek Yan, with co-investigators Dr. Hugh Tan and Dr. Darren Yeo. The working team includes Justin Nai, Edwin Tan, Hong Jhun Sim, Rachel Lee, and other members of the field team.</p>
<p><br></p>
<p>This post is also shared on <a href="https://r-bloggers.com">R-bloggers.com</a>.</p>
<p><br></p>
</section>
<div id="quarto-appendix" class="default"><section id="footnotes" class="footnotes footnotes-end-of-document"><h2 class="anchored quarto-appendix-heading">Footnotes</h2>
<ol>
<li id="fn1"><p>Baker, D. J., Maclean, I. M., Goodall, M., & Gaston, K. J. (2021). Species distribution modelling is needed to support ecological impact assessments. <em>Journal of Applied Ecology</em>, <em>58</em>(1), 21-26.↩︎</p></li>
<li id="fn2"><p>Bond, W. J. (1994). Keystone species. In <em>Biodiversity and ecosystem function</em> (pp. 237-253). Springer, Berlin, Heidelberg.↩︎</p></li>
<li id="fn3"><p>Power, M. E., Tilman, D., Estes, J. A., Menge, B. A., Bond, W. J., Mills, L. S., … & Paine, R. T. (1996). Challenges in the quest for keystones: identifying keystone species is difficult—but essential to understanding how loss of species will affect ecosystems. <em>BioScience</em>, <em>46</em>(8), 609-620.↩︎</p></li>
<li id="fn4"><p>Schmitt, S., Pouteau, R., Justeau, D., De Boissieu, F., & Birnbaum, P. (2017). ssdm: An r package to predict distribution of species richness and composition based on stacked species distribution models. <em>Methods in Ecology and Evolution</em>, <em>8</em>(12), 1795-1803.↩︎</p></li>
<li id="fn5"><p>Government of Singapore (2020). <a href="https://data.gov.sg/dataset/master-plan-2019-subzone-boundary-no-sea">Master Plan 2019 Subzone Boundary (No Sea)</a>. <em>data.gov.sg</em>. Released under the terms of the <a href="https://data.gov.sg/open-data-licence">Singapore Open Data Licence version 1.0</a>.↩︎</p></li>
</ol>
</section><section class="quarto-appendix-contents" id="quarto-citation"><h2 class="anchored quarto-appendix-heading">Citation</h2><div><div class="quarto-appendix-secondary-label">BibTeX citation:</div><pre class="sourceCode code-with-copy quarto-appendix-bibtex"><code class="sourceCode bibtex">@software{x._p.2022,
author = {X. P. , Song and E. Y. W. , Tan and S. K. R. , Lee and H. J.
, Sim and J. , Nai and K. Y. , Chong},
title = {Biodivercity: {An} {R} Package for Spatial Assessment of
Biodiversity Across City Landscapes},
version = {0.1.0},
date = {2022},
url = {https://xpsong.com/posts/biodivercity/},
doi = {10.5281/zenodo.7410414},
langid = {en}
}
</code></pre><div class="quarto-appendix-secondary-label">For attribution, please cite this work as:</div><div id="ref-x._p.2022" class="csl-entry quarto-appendix-citeas">
X. P., Song, Tan E. Y. W., Lee S. K. R., Sim H. J., Nai J., and Chong K.
Y. 2022. <span>“Biodivercity: An R Package for Spatial Assessment of
Biodiversity Across City Landscapes.”</span> <a href="https://doi.org/10.5281/zenodo.7410414">https://doi.org/10.5281/zenodo.7410414</a>.
</div></div></section></div> ]]></description>
<category>R</category>
<category>Research</category>
<category>Software</category>
<guid>https://xpsong.com/posts/biodivercity/</guid>
<pubDate>Sun, 31 Jul 2022 16:00:00 GMT</pubDate>
<media:content url="https://xpsong.com/posts/biodivercity/featured.jpg" medium="image" type="image/jpeg"/>
</item>
<item>
<title>Nathan’s first drumset</title>
<dc:creator>XP Song</dc:creator>
<link>https://xpsong.com/posts/nathans-first-drumset/</link>
<description><![CDATA[
<p>Sample it before it gets wrecked!</p>
<center>
<div style="position: relative; padding-bottom: 56.25%; height: 0; overflow: hidden;">
<iframe src="https://www.youtube.com/embed/i90zVvyJaZk" style="position: absolute; top: 0; left: 0; width: 100%; height: 100%; border:0;" allowfullscreen="" title="Nathan's First Drumset">
</iframe>
</div>
</center>
<p><br></p>
]]></description>
<category>Music</category>
<guid>https://xpsong.com/posts/nathans-first-drumset/</guid>
<pubDate>Sun, 09 Jan 2022 16:00:00 GMT</pubDate>
<media:content url="https://xpsong.com/posts/nathans-first-drumset/featured.jpg" medium="image" type="image/jpeg"/>
</item>
<item>
<title>Trying to fight cabin fever? The types of parks near your home matter</title>
<dc:creator>XP Song</dc:creator>
<link>https://xpsong.com/posts/home2park/</link>
<description><![CDATA[
<p>The COVID-19 pandemic is changing the way we work and play<sup>1</sup>. As more people work remotely from home, visits to local parks have surged as people look for ways to stay active, find solace and quench their thirst for adventure<sup>2</sup>. Such opportunities for leisure and recreation play an important part in people’s physical and psychological well-being<sup>3</sup>.</p>
<center>
<div class="quarto-figure quarto-figure-center">
<figure class="figure">
<p><img src="https://xpsong.com/posts/home2park/assets/bishan-amk.JPG" class="img-fluid figure-img"></p>
<figcaption>An elderly man sketching the landscape at Bishan-Ang Mo Kio Park in Singapore</figcaption>
</figure>
</div>
</center>
<p><br></p>
<p>Not all parks are created equal. For example, people seeking solace may be drawn to waterfront or nature parks with open or tranquil landscapes<sup>4</sup>. For others who love adventure (like myself), bike trails are great to break a sweat while having fun with friends. But most of the bike trails in Singapore<sup>5</sup> are really hard to get to from where I live (not to mention figuring out bike transport). So, instead of heading out for some adventure, I tend to visit the open field and playground closest to home, going out for short walks with my son (still waiting for my next adrenaline fix!).</p>
<center>
<div class="quarto-figure quarto-figure-center">
<figure class="figure">
<p><img src="https://xpsong.com/posts/home2park/assets/upper-peirce-reservoir.jpg" class="img-fluid figure-img"></p>
<figcaption><em>Waterfront landscapes at Upper Peirce Reservoir in Singapore</em></figcaption>
</figure>
</div>
<div class="quarto-figure quarto-figure-center">
<figure class="figure">
<p><img src="https://xpsong.com/posts/home2park/assets/canal.jpeg" class="img-fluid figure-img"></p>
<figcaption><em>A very different kind of ‘waterfront’ closer to home</em></figcaption>
</figure>
</div>
</center>
<p><br></p>
<p>In city planning, park provision is typically measured by summarising the park area within a given region<sup>6</sup>. Beyond the area of parks, however, there is a need to characterise the wide variety of parks that serve different groups of people. Understanding such nuances in park access can benefit both residents and city planners, for instance, when considering the location of a new home, or when planning for new parks in the city.</p>
<center>
<div class="quarto-figure quarto-figure-center">
<figure class="figure">
<p><img src="https://xpsong.com/posts/home2park/assets/park-type.png" class="img-fluid figure-img"></p>
<figcaption>Basic classification of parks in Singapore based on OpenStreetMap (OSM) data in 2020. Includes outdoor spaces such as beaches and informal nature areas.</figcaption>
</figure>
</div>
</center>
<p><br></p>
<p>Our new R package <a href="https://ecological-cities.github.io/home2park"><code>home2park</code></a> provides ways to measure a variety of park features related to recreation such as foot/cycle trails, waterfronts, forests, open spaces, playgrounds and sport/fitness amenities. It includes functions to download such data from OSM. Alternatively, you may supply your own proprietary datasets, or use new data for the purpose of future scenario planning. Summarising these features at each park allows us to make quantitative comparisons between parks. Some examples are shown below for the city of Singapore, based on OSM data in the year 2020. For instance, parks with extensive trails provide opportunities for running/wheeled sports:</p>
<center>
<div class="quarto-figure quarto-figure-center">
<figure class="figure">
<p><img src="https://xpsong.com/posts/home2park/assets/park-trails.png" class="img-fluid figure-img"></p>
<figcaption><em>Length of <a href="https://ecological-cities.github.io/home2park/reference/get_trails_osm.html">trails</a> (foot- and cycle-ways) in Singapore</em></figcaption>
</figure>
</div>
<div class="quarto-figure quarto-figure-center">
<figure class="figure">
<p><img src="https://xpsong.com/posts/home2park/assets/marina-barrage.jpg" class="img-fluid figure-img"></p>
<figcaption><em>Cycling with our little one along East Coast Park (a rather rare occasion)</em></figcaption>
</figure>
</div>
<p><br></p>
</center>
<p>Other features such as water and vegetation are associated with visual relief, as well as restorative effects on people’s psychological well-being<sup>7</sup>. If such data (e.g., satellite imagery) are available, they can also be <a href="https://ecological-cities.github.io/home2park/reference/parks_calc_attributes.html">summarised</a> at each park:</p>
<center>
<div class="quarto-figure quarto-figure-center">
<figure class="figure">
<p><img src="https://xpsong.com/posts/home2park/assets/park-waterfronts.png" class="img-fluid figure-img"></p>
<figcaption><em>Length of waterfronts within & near parks in Singapore</em></figcaption>
</figure>
</div>
<div class="quarto-figure quarto-figure-center">
<figure class="figure">
<p><img src="https://xpsong.com/posts/home2park/assets/park-forests.png" class="img-fluid figure-img"></p>
<figcaption><em>Area of dense vegetation within parks in Singapore</em></figcaption>
</figure>
</div>
<div class="quarto-figure quarto-figure-center">
<figure class="figure">
<p><img src="https://xpsong.com/posts/home2park/assets/upper-peirce-reservoir2.jpg" class="img-fluid figure-img"></p>
<figcaption><em>Large water bodies and forest patches are present at Upper Peirce Reservoir Park</em></figcaption>
</figure>
</div>
<p><br></p>
</center>
<p>To calculate the ‘supply’ of these park features to homes city-wide, the summarised values per park can then be assigned to each residential building. Since people are less likely to visit parks further away from their homes, a ‘distance decay’ parameter can be included. This reduces the supply value originating from parks further away. The ‘distance decay’ for walking to urban parks was empirically determined by Tu et al.<sup>8</sup> to fit a negative exponential curve with a coefficient <em>c</em> value of <em>0.661</em> (see the <a href="https://ecological-cities.github.io/home2park"><code>home2park</code></a> package for details). The example below shows the supply of park area to residential buildings in Singapore.</p>
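<p>The core of the distance-decay calculation can be sketched in a few lines of base R. This is a simplified illustration with made-up numbers, not the package’s actual implementation:</p>

```r
# Distance-weighted 'supply' from parks to a single building:
# supply = sum over parks k of value_k * exp(-c * d_k),
# with c = 0.661 (Tu et al. 2020) and d_k the distance to park k in km
park_area <- c(50, 10)   # park 'value', e.g. area in ha (toy numbers)
dist_km   <- c(2, 0.5)   # distance from the building to each park
c_coef    <- 0.661

supply <- sum(park_area * exp(-c_coef * dist_km))
```

<p>Setting <em>c</em> to 0 recovers a simple unweighted sum of park values, while larger values of <em>c</em> discount distant parks more steeply.</p>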
<center>
<div class="quarto-figure quarto-figure-center">
<figure class="figure">
<p><img src="https://xpsong.com/posts/home2park/assets/ss-park-area-buildings.png" class="img-fluid figure-img"></p>
<figcaption><em>The supply of park area to residential buildings in the year 2020, based on a ‘distance decay’ coefficient c value of 0.661. Each building is denoted as a point (a random subset is shown). Details in the <a href="https://ecological-cities.github.io/home2park/">package website</a>.</em></figcaption>
</figure>
</div>
<p><br></p>
</center>
<p>Notice how the largest parks in Singapore tend to be centrally located, especially since we include informal nature areas (e.g., Central Catchment Nature Reserve). Buildings closer to these huge parks thus have a much larger ‘supply’ of park area. However, each residential building houses a different number of people. A large park next to high-rise apartments would benefit far more people than one next to low-rise housing. If we take into account the number of residents living within each building, a very different picture emerges!</p>
<center>
<div class="quarto-figure quarto-figure-center">
<figure class="figure">
<p><img src="https://xpsong.com/posts/home2park/assets/ss-park-area-residents.png" class="img-fluid figure-img"></p>
<figcaption><em>The supply of park area to all building residents in the year 2020. Each building is denoted as a point (a random subset is shown). Details in the <a href="https://ecological-cities.github.io/home2park/">package website</a>.</em></figcaption>
</figure>
</div>
<p><br></p>
</center>
<p>These building-level metrics can subsequently be summarised across larger regions, for example, based on <a href="https://data.gov.sg/dataset/master-plan-2019-subzone-boundary-no-sea">zones</a> used in city planning. Not surprisingly, the supply metric for park area derived from individual buildings is vastly different from conventional metrics used today (examples in Tan and Samsudin<sup>9</sup>). Feel free to explore these metrics for the provision of park area in the interactive map below.</p>
<p><br></p>
<iframe seamless="" src="assets/plot_leaflet.html" width="100%" height="500">
</iframe>
<center>
<strong>Map: Supply of park area in Singapore based on OSM data (2020).</strong> Each building is denoted as a point (a random subset is shown). The colour palettes were binned according to quantile values.
</center>
<p><br></p>
<p>In these unprecedented times, significant changes to human mobility and working environments will require us to rethink land use distribution in the city, particularly for office, residential and public spaces such as parks. Some evidence also points to shifting preferences, for example, toward wilder green areas<sup>10</sup>. These issues are starting to become a part of the national conversation in Singapore<sup>11</sup>, which is important considering that the city has major plans for sustainable development and greening over the next decade<sup>12</sup>. As part of the Singapore Green Plan 2030, there will be 1,000 ha (more than 1,800 football fields!) more parks and park connectors, and every household will live just a 10-minute walk away from a park<sup>13</sup>. In view of these major plans, there is a need to move beyond basic summaries of area, and consider the variety of park features that are important for outdoor recreation. The R package <a href="https://ecological-cities.github.io/home2park"><code>home2park</code></a> represents a part of our effort to contribute to a more nuanced understanding of outdoor recreation and its spatial provision in cities. Note that the package is still experimental, so do reach out if you would like to <a href="https://ecological-cities.github.io/home2park/CONTRIBUTING.html">contribute</a> to improvements, or to report any <a href="https://github.com/ecological-cities/home2park/issues">bugs</a>.</p>
<p><br></p>
<section id="acknowledgements" class="level2">
<h2 class="anchored" data-anchor-id="acknowledgements">Acknowledgements</h2>
<p>I’d like to gratefully acknowledge Kwek Yan for his valuable inputs and help in refining the ideas we are working on here, as well as Edwin Tan and Justin Nai for the useful discussions. The analyses shown in this blog post are part of a larger project that investigates the balance of park supply and demand in cities between different groups of people, as well as how ‘distance decay’ affects park use. Do reach out if you are keen to collaborate on related topics. Finally, to cite <a href="https://ecological-cities.github.io/home2park/"><code>home2park</code></a> or acknowledge its use, you may use the following reference:</p>
<blockquote class="blockquote">
<p>Song, X. P., Chong, K. Y. (2021). home2park: An R package to assess the spatial provision of urban parks. <em>Journal of Open Source Software</em>, <em>6</em>(65), 3609. <a href="https://doi.org/10.21105/joss.03609" class="uri">https://doi.org/10.21105/joss.03609</a></p>
</blockquote>
<p><br></p>
<p><br></p>
<p>This post is also shared on <a href="https://r-bloggers.com">R-bloggers.com</a>.</p>
<p><br></p>
</section>
<div id="quarto-appendix" class="default"><section id="footnotes" class="footnotes footnotes-end-of-document"><h2 class="anchored quarto-appendix-heading">Footnotes</h2>
<ol>
<li id="fn1"><p>Hoffower, H. (2021, June 23). <a href="https://www.businessinsider.com/marc-andreessen-remote-work-more-important-than-internet-civilizational-shift-2021-6">Tech legend Marc Andreessen says the rise of remote work might be more important than the internet: ‘A permanent civilizational shift’</a>. <em>Business Insider.</em>↩︎</p></li>
<li id="fn2"><p>Geng, D. C., Innes, J., Wu, W., & Wang, G. (2021). Impacts of COVID-19 pandemic on urban park visitation: a global analysis. <em>Journal of forestry research</em>, 32(2), 553-567.<br>
Ting, W. P. (2020, November 1). <a href="https://www.todayonline.com/singapore/hiking-boom-singapore-more-turn-their-backyard-cope-covid-19-travel-restrictions">Hiking boom in Singapore as more turn to their backyard to cope with Covid-19 travel restrictions</a>. <em>Today</em>. https://www.todayonline.com<br>
Nadarajan, R. (2020, April 2). <a href="https://www.todayonline.com/singapore/covid-19-seeking-fun-and-solace-worries-wide-open-spaces-singaporeans-turn-outdoors">Covid-19: Seeking fun and solace from worries in wide open spaces, Singaporeans turn to the outdoors</a>. <em>Today</em>.↩︎</p></li>
<li id="fn3"><p>Wood, L., Hooper, P., Foster, S., & Bull, F. (2017). Public green spaces and positive mental health–investigating the relationship between access, quantity and types of parks and mental wellbeing. <em>Health & place</em>, 48, 63-71.<br>
Rajoo, K. S., Karam, D. S., & Abdullah, M. Z. (2020). The physiological and psychosocial effects of forest therapy: A systematic review. <em>Urban Forestry & Urban Greening</em>, 126744.↩︎</p></li>
<li id="fn4"><p>Examples in Singapore include <a href="https://www.nparks.gov.sg/gardens-parks-and-nature/parks-and-nature-reserves/east-coast-park">East Coast Park</a>, <a href="https://www.nparks.gov.sg/gardens-parks-and-nature/parks-and-nature-reserves/upper-peirce-reservoir-park">Upper Peirce Reservoir</a>, <a href="https://www.nparks.gov.sg/gardens-parks-and-nature/parks-and-nature-reserves/bukit-batok-nature-park">Bukit Batok Nature Park</a>↩︎</p></li>
<li id="fn5"><p>http://shimanocyclingworld.com/5-best-mountain-bike-trails-in-singapore/↩︎</p></li>
<li id="fn6"><p>Tan, P. Y., & Samsudin, R. (2017). Effects of spatial scale on assessment of spatial equity of urban park provision. <em>Landscape and Urban Planning</em>, 158, 139-154.↩︎</p></li>
<li id="fn7"><p>Wood, L., Hooper, P., Foster, S., & Bull, F. (2017). Public green spaces and positive mental health–investigating the relationship between access, quantity and types of parks and mental wellbeing. <em>Health & place</em>, 48, 63-71.<br>
Rajoo, K. S., Karam, D. S., & Abdullah, M. Z. (2020). The physiological and psychosocial effects of forest therapy: A systematic review. <em>Urban Forestry & Urban Greening</em>, 126744.↩︎</p></li>
<li id="fn8"><p>Tu, X., Huang, G., Wu, J., & Guo, X. (2020). How do travel distance and park size influence urban park visits?. <em>Urban Forestry & Urban Greening</em>, 52, 126689.↩︎</p></li>
<li id="fn9"><p>Tan, P. Y., & Samsudin, R. (2017). Effects of spatial scale on assessment of spatial equity of urban park provision. <em>Landscape and Urban Planning</em>, 158, 139-154.↩︎</p></li>
<li id="fn10"><p>Grzyb, T., Kulczyk, S., Derek, M., & Woźniak, E. (2021). Using social media to assess recreation across urban green spaces in times of abrupt change. <em>Ecosystem Services</em>, 49, 101297.↩︎</p></li>
<li id="fn11"><p>Ng, K. G. (2021, June 23). <a href="https://www.straitstimes.com/singapore/understand-quality-not-just-quantity-of-green-spaces-expert">Understand quality, not just quantity, of green spaces:Expert</a>. <em>The Straits Times.</em>↩︎</p></li>
<li id="fn12"><p>Tan, A. (2021, February 10). <a href="https://www.straitstimes.com/singapore/environment/singapore-green-plan-2030-to-change-the-way-people-live-work-study-and-play">Singapore Green Plan 2030 to change the way people live, work, study and play</a>. <em>The Straits Times</em>.<br>
Lin, C. (2021, March 4). <a href="https://www.channelnewsasia.com/news/singapore/nparks-new-trails-cross-island-corridor-parks-14332300">NParks to develop Singapore’s longest cross-island trail and 3 new recreational routes</a>. <em>Channel News Asia</em>.↩︎</p></li>
<li id="fn13"><p>Au-Yong, R. (2019, March 28). <a href="https://www.straitstimes.com/singapore/spore-to-have-1000ha-more-parks-and-park-connectors">Singapore to have 1,000ha more parks and park connectors</a>. <em>The Straits Times</em>.<br>
Singapore Green Plan 2030 (2021). <a href="https://www.greenplan.gov.sg/key-focus-areas/city-in-nature/">City in Nature</a>.↩︎</p></li>
</ol>
</section><section class="quarto-appendix-contents" id="quarto-citation"><h2 class="anchored quarto-appendix-heading">Citation</h2><div><div class="quarto-appendix-secondary-label">BibTeX citation:</div><pre class="sourceCode code-with-copy quarto-appendix-bibtex"><code class="sourceCode bibtex">@article{x._p.2021,
author = {X. P. , Song and K. Y. , Chong},
title = {Home2park: {An} {R} Package to Assess the Spatial Provision
of Urban Parks},
journal = {Journal of Open Source Software},
volume = {6},
number = {65},
pages = {3609},
date = {2021},
url = {https://xpsong.com/posts/home2park/},
doi = {10.21105/joss.03609},
langid = {en}
}
</code></pre><div class="quarto-appendix-secondary-label">For attribution, please cite this work as:</div><div id="ref-x._p.2021" class="csl-entry quarto-appendix-citeas">
X. P., Song, and Chong K. Y. 2021. <span>“Home2park: An R Package to
Assess the Spatial Provision of Urban Parks.”</span> <em>Journal of Open
Source Software</em> 6 (65): 3609. <a href="https://doi.org/10.21105/joss.03609">https://doi.org/10.21105/joss.03609</a>.
</div></div></section></div> ]]></description>
<category>R</category>
<category>Research</category>
<category>Software</category>
<guid>https://xpsong.com/posts/home2park/</guid>
<pubDate>Thu, 15 Jul 2021 16:00:00 GMT</pubDate>
<media:content url="https://xpsong.com/posts/home2park/featured.jpg" medium="image" type="image/jpeg"/>
</item>
<item>
<title>Changes in a city’s land cover over time</title>
<dc:creator>XP Song</dc:creator>
<link>https://xpsong.com/posts/city-landcover-change/</link>
<description><![CDATA[
<p>As part of a research project to develop biodiversity indices for city planning, I’ve had to quantify different components of the landscape using satellite data. One of these components is land cover, from which other metrics can be further derived. So far, this has been done for <a href="https://sentinel.esa.int/web/sentinel/missions/sentinel-2">Sentinel-2</a> and <a href="https://earth.esa.int/eogateway/missions/skysat">Skysat</a> data. Here is a brief summary of the steps and template R code used to derive land cover classes from publicly-available Sentinel-2 imagery. We’re open to collaborating and exploring new applications in remote sensing, so we’d love to hear from you if you have any feedback or ideas!</p>
<p><br></p>
<hr>
<p>First, load the required R packages:</p>
<div class="cell">
<div class="sourceCode cell-code" id="cb1" style="background: #f1f3f5;"><pre class="sourceCode r code-with-copy"><code class="sourceCode r"><span id="cb1-1"><span class="fu" style="color: #4758AB;
background-color: null;
font-style: inherit;">library</span>(tidyverse)</span>
<span id="cb1-2"></span>
<span id="cb1-3"><span class="co" style="color: #5E5E5E;
background-color: null;
font-style: inherit;"># sentinel-2 data</span></span>
<span id="cb1-4"><span class="fu" style="color: #4758AB;
background-color: null;
font-style: inherit;">library</span>(sen2r)</span>
<span id="cb1-5"></span>
<span id="cb1-6"><span class="co" style="color: #5E5E5E;
background-color: null;
font-style: inherit;"># image processing</span></span>
<span id="cb1-7"><span class="fu" style="color: #4758AB;
background-color: null;
font-style: inherit;">library</span>(EBImage)</span>
<span id="cb1-8"></span>
<span id="cb1-9"><span class="co" style="color: #5E5E5E;
background-color: null;
font-style: inherit;"># spatial analysis</span></span>
<span id="cb1-10"><span class="fu" style="color: #4758AB;
background-color: null;
font-style: inherit;">library</span>(terra)</span></code></pre></div>
</div>
<p><br></p>
<p>And here’s the raw shape file for our area of interest:</p>
<center>
<div class="cell" data-layout-align="center">
<div class="cell-output-display">
<div class="quarto-figure quarto-figure-center">
<figure class="figure">
<p><img src="https://xpsong.com/posts/city-landcover-change/index_files/figure-html/unnamed-chunk-3-1.png" class="img-fluid figure-img" width="2100"></p>
<figcaption>Figure 1: Singapore subzone boundaries based on the <a href="https://data.gov.sg/dataset/master-plan-2019-subzone-boundary-no-sea">Master Plan 2019</a>.</figcaption>
</figure>
</div>
</div>
</div>
</center>
<p><br></p>
<section id="calculate-spectral-indices" class="level2">
<h2 class="anchored" data-anchor-id="calculate-spectral-indices">Calculate spectral indices</h2>
<p>To download Sentinel-2 images, we can use the R package <a href="http://sen2r.ranghetti.info"><code>sen2r</code></a> to programmatically download the satellite data within a specified date range. It also allows us to run pre-processing steps such as cloud masking, atmospheric correction and the calculation of spectral indices.</p>
<div class="cell">
<div class="sourceCode cell-code" id="cb2" style="background: #f1f3f5;"><pre class="sourceCode r code-with-copy"><code class="sourceCode r"><span id="cb2-1"><span class="co" style="color: #5E5E5E;
background-color: null;
font-style: inherit;"># get search parameters</span></span>
<span id="cb2-2">json_path <span class="ot" style="color: #003B4F;
background-color: null;
font-style: inherit;"><-</span> <span class="st" style="color: #20794D;
background-color: null;
font-style: inherit;">"<PATH TO YOUR JSON FILE>"</span></span>
<span id="cb2-3"><span class="co" style="color: #5E5E5E;
background-color: null;
font-style: inherit;"># you can create this file by running 'sen2r()', then using the graphical user interface to specify & save your parameters (e.g. max cloud cover per image, etc.)</span></span>
<span id="cb2-4"></span>
<span id="cb2-5"></span>
<span id="cb2-6"><span class="co" style="color: #5E5E5E;
background-color: null;
font-style: inherit;"># download</span></span>
<span id="cb2-7">out_paths <span class="ot" style="color: #003B4F;
background-color: null;
font-style: inherit;"><-</span> <span class="fu" style="color: #4758AB;
background-color: null;
font-style: inherit;">sen2r</span>(</span>
<span id="cb2-8"> <span class="at" style="color: #657422;
background-color: null;
font-style: inherit;">param_list =</span> json_path,</span>
<span id="cb2-9"> <span class="at" style="color: #657422;
background-color: null;
font-style: inherit;">extent_as_mask =</span> <span class="cn" style="color: #8f5902;
background-color: null;
font-style: inherit;">TRUE</span>, <span class="co" style="color: #5E5E5E;
background-color: null;
font-style: inherit;"># mask the image based on your supplied shape file</span></span>
<span id="cb2-10"> <span class="at" style="color: #657422;
background-color: null;
font-style: inherit;">list_rgb =</span> <span class="st" style="color: #20794D;
background-color: null;
font-style: inherit;">"RGB432B"</span> <span class="co" style="color: #5E5E5E;
background-color: null;
font-style: inherit;"># output RGB image</span></span>
<span id="cb2-11"> )</span></code></pre></div>
</div>
<p>For each data type (e.g., spectral index), we can combine all raster images captured within the specified date range by averaging the pixel values across files (thus forming an image mosaic). This allows us to avoid relying on any one image for our analysis, and to deal with missing data (e.g., due to high cloud cover) during the period of interest. Depending on the data type, we can scale the values and remove outliers prior to forming the image mosaic. You might also want to consider parallelising the code if there are many files. Run the following code for each data type:</p>
<div class="cell">
<div class="sourceCode cell-code" id="cb3" style="background: #f1f3f5;"><pre class="sourceCode r code-with-copy"><code class="sourceCode r"><span id="cb3-1">filepaths <span class="ot" style="color: #003B4F;
background-color: null;
font-style: inherit;"><-</span> <span class="st" style="color: #20794D;
background-color: null;
font-style: inherit;">"<PATHS TO YOUR PROCESSED FILES>"</span></span>
<span id="cb3-2"></span>
<span id="cb3-3">images <span class="ot" style="color: #003B4F;
background-color: null;
font-style: inherit;"><-</span> <span class="fu" style="color: #4758AB;
background-color: null;
font-style: inherit;">lapply</span>(filepaths, rast) <span class="co" style="color: #5E5E5E;
background-color: null;
font-style: inherit;"># import rasters as a list</span></span>
<span id="cb3-4">mosaic <span class="ot" style="color: #003B4F;
background-color: null;
font-style: inherit;"><-</span> <span class="fu" style="color: #4758AB;
background-color: null;
font-style: inherit;">do.call</span>(terra<span class="sc" style="color: #5E5E5E;
background-color: null;
font-style: inherit;">::</span>mosaic, <span class="fu" style="color: #4758AB;
background-color: null;
font-style: inherit;">c</span>(images, <span class="fu" style="color: #4758AB;
background-color: null;
font-style: inherit;">list</span>(<span class="at" style="color: #657422;
background-color: null;
font-style: inherit;">fun =</span> <span class="st" style="color: #20794D;
background-color: null;
font-style: inherit;">"mean"</span>)))</span>
<span id="cb3-5"></span>
<span id="cb3-6"><span class="co" style="color: #5E5E5E;
background-color: null;
font-style: inherit;"># export mosaic</span></span>
<span id="cb3-7"><span class="fu" style="color: #4758AB;
background-color: null;
font-style: inherit;">writeRaster</span>(mosaic, <span class="st" style="color: #20794D;
background-color: null;
font-style: inherit;">"<MOSAIC FILE NAME>.tif"</span>,</span>
<span id="cb3-8"> <span class="at" style="color: #657422;
background-color: null;
font-style: inherit;">wopt =</span> <span class="fu" style="color: #4758AB;
background-color: null;
font-style: inherit;">list</span>(<span class="at" style="color: #657422;
background-color: null;
font-style: inherit;">gdal=</span><span class="fu" style="color: #4758AB;
background-color: null;
font-style: inherit;">c</span>(<span class="st" style="color: #20794D;
background-color: null;
font-style: inherit;">"COMPRESS=LZW"</span>)), <span class="co" style="color: #5E5E5E;
background-color: null;
font-style: inherit;"># compress output</span></span>
<span id="cb3-9"> <span class="at" style="color: #657422;
background-color: null;
font-style: inherit;">overwrite =</span> <span class="cn" style="color: #8f5902;
background-color: null;
font-style: inherit;">TRUE</span>)</span></code></pre></div>
</div>
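<p>As a concrete illustration of the outlier-removal step mentioned above, here is a hedged base-R sketch that clamps values to the 2nd–98th percentiles before averaging, using plain matrices as stand-ins for the raster layers (with real rasters, <code>terra::clamp()</code> can perform the clamping):</p>

```r
# Clamp extreme pixel values to the 2nd-98th percentile range (toy example)
clamp_quantiles <- function(m, lo = 0.02, hi = 0.98) {
  q <- quantile(m, probs = c(lo, hi), na.rm = TRUE)
  pmin(pmax(m, q[1]), q[2])
}

set.seed(42)
imgs <- list(matrix(rnorm(100), nrow = 10),   # stand-ins for raster layers
             matrix(rnorm(100), nrow = 10))
imgs <- lapply(imgs, clamp_quantiles)         # remove outliers per image

# pixel-wise mean across images (the 'mosaic' step)
mosaic_vals <- Reduce(`+`, imgs) / length(imgs)
```

<p>With many files, the <code>lapply()</code> call is the natural place to parallelise (e.g., with a parallel apply function).</p>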
<p><br></p>
</section>
<section id="classify-land-cover" class="level2">
<h2 class="anchored" data-anchor-id="classify-land-cover">Classify land cover</h2>
<p>At this point, we have a single image mosaic (raster) for each spectral index. While the continuous values from these rasters may be used directly in analyses, there may be instances where we want to work with discrete classes of land cover. One method is to separate pixels into one of two classes (e.g., vegetated or non-vegetated; water or land), based on an adaptively derived threshold value. For example, we use Otsu’s thresholding (<a href="https://cw.fel.cvut.cz/wiki/_media/courses/a6m33bio/otsu.pdf">Otsu, 1979</a>), which tends to outperform other techniques in terms of stability of results and processing speed, even in the presence of > 2 peaks in the histogram of pixel values (<a href="https://www.tandfonline.com/doi/abs/10.1080/10106049.2018.1497094">Bouhennache et al., 2019</a>; see figure below). This may be implemented using the <a href="https://www.rdocumentation.org/packages/EBImage/versions/4.14.2/topics/otsu"><code>otsu()</code></a> function in <code>library(EBImage)</code>:</p>
<div class="cell">
<div class="sourceCode cell-code" id="cb4" style="background: #f1f3f5;"><pre class="sourceCode r code-with-copy"><code class="sourceCode r"><span id="cb4-1">threshold_value <span class="ot" style="color: #003B4F;
background-color: null;
font-style: inherit;"><-</span> EBImage<span class="sc" style="color: #5E5E5E;
background-color: null;
font-style: inherit;">::</span><span class="fu" style="color: #4758AB;
background-color: null;
font-style: inherit;">otsu</span>(mosaic, </span>
<span id="cb4-2"> <span class="at" style="color: #657422;
background-color: null;
font-style: inherit;">range =</span> <span class="fu" style="color: #4758AB;
background-color: null;
font-style: inherit;">c</span>(<span class="sc" style="color: #5E5E5E;
background-color: null;
font-style: inherit;">-</span><span class="dv" style="color: #AD0000;
background-color: null;
font-style: inherit;">1</span>, <span class="dv" style="color: #AD0000;
background-color: null;
font-style: inherit;">1</span>), <span class="co" style="color: #5E5E5E;
background-color: null;
font-style: inherit;"># histogram range</span></span>
<span id="cb4-3"> <span class="at" style="color: #657422;
background-color: null;
font-style: inherit;">levels =</span> <span class="dv" style="color: #AD0000;
background-color: null;
font-style: inherit;">256</span>) <span class="co" style="color: #5E5E5E;
background-color: null;
font-style: inherit;"># depends on image bit-depth</span></span>
<span id="cb4-4"></span>
<span id="cb4-5">classified <span class="ot" style="color: #003B4F;
background-color: null;
font-style: inherit;"><-</span> mosaic <span class="co" style="color: #5E5E5E;
background-color: null;
font-style: inherit;"># duplicate</span></span>
<span id="cb4-6"></span>
<span id="cb4-7">classified[mosaic <span class="sc" style="color: #5E5E5E;
background-color: null;
font-style: inherit;"><</span> threshold_value] <span class="ot" style="color: #003B4F;
background-color: null;
font-style: inherit;"><-</span> <span class="dv" style="color: #AD0000;
background-color: null;
font-style: inherit;">0</span> <span class="co" style="color: #5E5E5E;
background-color: null;
font-style: inherit;"># assign value of 0 for pixels below threshold (e.g. non-vegetated)</span></span>
<span id="cb4-8">classified[mosaic <span class="sc" style="color: #5E5E5E;
background-color: null;
font-style: inherit;">>=</span> threshold_value] <span class="ot" style="color: #003B4F;
background-color: null;
font-style: inherit;"><-</span> <span class="dv" style="color: #AD0000;
background-color: null;
font-style: inherit;">1</span> <span class="co" style="color: #5E5E5E;
background-color: null;
font-style: inherit;"># assign value of 1 for pixels above threshold (e.g. vegetated)</span></span></code></pre></div>
</div>
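<p>For intuition, Otsu’s method picks the threshold that maximises the between-class variance of the two resulting groups of pixel values. Below is a minimal base-R sketch on a plain numeric vector, a simplified, hypothetical stand-in for <code>EBImage::otsu()</code> (which operates on image objects):</p>

```r
# Minimal Otsu's threshold for a numeric vector:
# choose the histogram bin midpoint that maximises between-class variance
otsu_threshold <- function(x, n_bins = 256) {
  h <- hist(x, breaks = n_bins, plot = FALSE)
  counts <- h$counts
  mids   <- h$mids
  total  <- sum(counts)
  best_t <- NA_real_
  best_v <- -Inf
  for (i in seq_len(length(mids) - 1)) {
    w0 <- sum(counts[1:i]) / total               # weight of 'below' class
    w1 <- 1 - w0                                 # weight of 'above' class
    if (w0 == 0 || w1 == 0) next
    m0 <- sum(counts[1:i] * mids[1:i]) / sum(counts[1:i])      # class means
    m1 <- sum(counts[-(1:i)] * mids[-(1:i)]) / sum(counts[-(1:i)])
    v  <- w0 * w1 * (m0 - m1)^2                  # between-class variance
    if (v > best_v) { best_v <- v; best_t <- mids[i] }
  }
  best_t
}

set.seed(1)
ndvi <- c(rnorm(500, -0.2, 0.1), rnorm(500, 0.6, 0.1))  # toy bimodal 'NDVI'
thr <- otsu_threshold(ndvi)  # falls between the two modes
```

<p>Pixels below <code>thr</code> would then be assigned 0 (e.g., non-vegetated) and those above it 1, as in the classification step shown earlier.</p>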
<p>As an example, the following figure shows the distribution of the <a href="https://www.indexdatabase.de/db/i-single.php?id=58">Normalized Difference Vegetation Index</a> (NDVI) values of a raster image. The NDVI is a measure of healthy green vegetation, based on the tendency of plants to reflect near-infrared (NIR) & absorb red light. It ranges from -1 (non-vegetated) to 1 (densely vegetated). Pixels that fall within the range of different threshold values (vertical lines) may be classified into discrete land cover types. If we do this for multiple date ranges, we can examine differences between periods. For example, in our project, we are currently comparing image mosaics captured during 2016–2019 (Survey Round One) and those captured during 2019–2022 (Survey Round Two). Each spectral index can be processed in different ways, and often has a different threshold value when used for land classification. For example, there are numerous other vegetation indices (e.g., NDRE, ARVI), as well as spectral indices used to classify water (e.g., NDWI) and built (e.g., NDBI) cover.</p>
<center>
<div class="cell" data-layout-align="center">
<div class="cell-output-display">
<div class="quarto-figure quarto-figure-center">
<figure class="figure">
<p><img src="https://xpsong.com/posts/city-landcover-change/assets/NDVI.png" class="img-fluid figure-img" style="width:70.0%"></p>
<figcaption>Figure 2: Distribution of NDVI values across Singapore in Survey Round One (light green bars, dashed vertical lines) and Two (dark green bars, solid vertical lines). Vegetation was classified as pixels with values > 0.35 for Round One and > 0.36 for Round Two. Dense vegetation was classified as pixels with values > 0.62 for Round One and > 0.66 for Round Two, based on a second round of Otsu’s thresholding after the first round of classification.</figcaption>
</figure>
</div>
</div>
</div>
</center>
<p><br></p>
</section>
<section id="accuracy-assessments" class="level2">
<h2 class="anchored" data-anchor-id="accuracy-assessments">Accuracy assessments</h2>
<p>How do we know that the land cover classes we have derived are accurate? Some form of ground-truthing is required. In our research project, on-site mapping has been performed at sampling points over the years.</p>
<center>
<div class="quarto-figure quarto-figure-center">
<figure class="figure">
<p><img src="https://xpsong.com/posts/city-landcover-change/assets/ground-truthing.png" class="img-fluid figure-img"></p>
<figcaption>Figure 3: Screenshot showing vegetation derived from on-site surveys and satellite imagery at sampling points in Punggol, Singapore.</figcaption>
</figure>
</div>
</center>
<p>The next figure shows a basic comparison of land cover area between surveyed (x-axes) and satellite (y-axes) data. As we can see, there are positive relationships between the two (particularly for vegetation cover), but the relatively large root-mean-square errors (RMSE) show that low-resolution satellite imagery does have its limitations.</p>
<center>
<div class="quarto-figure quarto-figure-center">
<figure class="figure">
<p><img src="https://xpsong.com/posts/city-landcover-change/assets/validation.png" class="img-fluid figure-img"></p>
<figcaption>Figure 4: Area of (a) vegetation, (b) built and (c) water cover derived from Sentinel-2 compared with on-site surveys at sampling points.</figcaption>
</figure>
</div>
</center>
<p><br></p>
</section>
<section id="summarise-per-zone" class="level2">
<h2 class="anchored" data-anchor-id="summarise-per-zone">Summarise per zone</h2>
<p>Now that we have land cover classified during different periods of time, one way to compare differences in land cover is to summarise them according to the zones used in city planning. Spectral indices (whether classified or not) can be summarised within each zone, to allow comparisons to be made between these planning units.</p>
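<p>For classified rasters, <code>terra</code> provides <code>zonal()</code> for this kind of summary; conceptually, it is just a grouped mean. Here is a hedged base-R sketch, with toy vectors standing in for the classified pixels and their zone labels:</p>

```r
# Proportion of vegetated pixels per planning zone (toy stand-in data)
veg  <- c(1, 1, 0, 0, 1, 0, 1, 1)                 # classified cover (1 = vegetated)
zone <- c("A", "A", "A", "B", "B", "B", "B", "B") # zone label per pixel
prop_veg <- tapply(veg, zone, mean)               # mean of 0/1 values = proportion
```

<p>The same grouped-mean logic applies whether the grouping comes from a zone raster or from polygons intersected with the pixels.</p>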
<p>Here’s a map showing the <em>proportional area</em> and <em>change</em> in the basic types of land cover within municipal subzones in Singapore. The <em>image quality</em>, <em>NDVI</em> and <em>classified vegetation</em> for both survey rounds are also viewable. Note that the reported number of <code>NA</code> pixels has been scaled up substantially (×10<sup>15</sup>) for the purpose of visualisation, so the image quality is actually pretty good (i.e., low cloud cover). This means that differences between survey rounds are unlikely to be due to differences in image quality. You may toggle the visibility of the different layers within the map.</p>
<iframe seamless="" src="assets/plot_leaflet.html" width="100%" height="500">
</iframe>
<p><br></p>
<p>Based on this comparison of satellite images captured between Survey Round One and Two, we find that:</p>
<ul>
<li><strong>Sparse vegetation</strong> increased within the Central Water Catchment and offshore islands. However, <strong>dense vegetation</strong> decreased substantially within these areas too.</li>
<li><strong>Water cover</strong> at the eastern (North Eastern Islands) and western (Tuas View Extension) tips of Singapore decreased, but increased at north-western areas, Jurong Island and the Central Water Catchment.</li>
<li><strong>Built cover</strong> decreased substantially within Jurong Island and along the south-western coast of Singapore.</li>
</ul>
<p><br></p>
<p>What factors have contributed to land cover changes during this period of time? What other interesting patterns do you see? Feel free to reach out to us if you are interested in collaborating <del>(P.S. We are also hiring!)</del>.</p>
<p><br></p>
<p>This post is also shared on <a href="https://r-bloggers.com">R-bloggers.com</a>.</p>
<p><br></p>
</section>
<div id="quarto-appendix" class="default"><section class="quarto-appendix-contents" id="quarto-citation"><h2 class="anchored quarto-appendix-heading">Citation</h2><div><div class="quarto-appendix-secondary-label">BibTeX citation:</div><pre class="sourceCode code-with-copy quarto-appendix-bibtex"><code class="sourceCode bibtex">@misc{x._p.2021,
author = {X. P. , Song},
title = {Changes in a City’s Land Cover over Time},
date = {2021-03-12},
url = {https://xpsong.com/posts/city-landcover-change/},
langid = {en}
}
</code></pre><div class="quarto-appendix-secondary-label">For attribution, please cite this work as:</div><div id="ref-x._p.2021" class="csl-entry quarto-appendix-citeas">
X. P., Song. 2021. <span>“Changes in a City’s Land Cover over
Time.”</span> March 12, 2021. <a href="https://xpsong.com/posts/city-landcover-change/">https://xpsong.com/posts/city-landcover-change/</a>.
</div></div></section></div> ]]></description>
<category>R</category>
<category>Research</category>
<category>Tutorials</category>
<guid>https://xpsong.com/posts/city-landcover-change/</guid>
<pubDate>Thu, 11 Mar 2021 16:00:00 GMT</pubDate>
<media:content url="https://xpsong.com/posts/city-landcover-change/featured.jpg" medium="image" type="image/jpeg"/>
</item>
<item>
<title>Zoom with Drums</title>
<dc:creator>XP Song</dc:creator>
<link>https://xpsong.com/posts/zoom-drum-lessons/</link>
<description><![CDATA[
<p>Recent measures to contain the coronavirus outbreak have left many local businesses reeling. I’ve been teaching drums part-time at My Drum School (MDS), but as of tomorrow, all centres will have to shut down their physical premises.</p>
<p>Thankfully, our school has already been scrambling to move to online lessons since last week. The team has been working hard to set up new equipment and release how-to videos, web resources and online syllabus material. The complete integration of all these materials and new workflows (in just a few days!) within the MDS mobile app is nothing short of amazing.</p>
<p>During my first few online lessons, I experienced first-hand the difficulty of playing along to music/click tracks, especially with regard to sound quality. After some experimentation, I was inspired to make a couple of video tutorials showing how these issues may be addressed. Hopefully this will help teachers and students improve their online lesson experience.</p>
<center>
<p><b>Part 1</b>:</p>
<div style="position: relative; padding-bottom: 56.25%; height: 0; overflow: hidden;">
<iframe src="https://www.youtube.com/embed/TkxKjjTsBQc" style="position: absolute; top: 0; left: 0; width: 100%; height: 100%; border:0;" allowfullscreen="" title="Zoom with Drums Part 1">
</iframe>
</div>
<p><br></p>
<p><b>Part 2</b>:</p>
<div style="position: relative; padding-bottom: 56.25%; height: 0; overflow: hidden;">
<iframe src="https://www.youtube.com/embed/clH19NYKYQs" style="position: absolute; top: 0; left: 0; width: 100%; height: 100%; border:0;" allowfullscreen="" title="Zoom with Drums Part 2">
</iframe>
</div>
</center>
<p><br></p>
<p><em>Note: For an introductory video on how to set up for online lessons, see the video released by My Drum School <a href="https://youtu.be/Pm9bQqvvu1Y">here</a>.</em></p>
]]></description>
<category>Music</category>
<guid>https://xpsong.com/posts/zoom-drum-lessons/</guid>
<pubDate>Sat, 04 Apr 2020 16:00:00 GMT</pubDate>
<media:content url="https://xpsong.com/posts/zoom-drum-lessons/featured.jpg" medium="image" type="image/jpeg"/>
</item>
<item>
<title>Analysing spatial patterns of the landscape</title>
<dc:creator>XP Song</dc:creator>
<link>https://xpsong.com/posts/intro2r-spatial/</link>
<description><![CDATA[
<p>In landscape ecology, we study interactions between organisms and their surrounding environment. The physical landscape is often quantified using ‘landscape metrics’, for further analyses alongside other sources of data.</p>
<p>My recent lectures are focused on introducing such concepts to landscape architects. While I’ve used FRAGSTATS and other GIS software for teaching in the past, I’ve always struggled with compatibility/licensing issues that just made the whole process really cumbersome.</p>
<p>Recently, I came across this R package called <a href="https://r-spatialecology.github.io/landscapemetrics/index.html">landscapemetrics</a>—I was blown away. I’ve always wanted to introduce people to spatial analyses in R, and to show how a programmatic, reproducible workflow in R can be an excellent alternative to conventional software. This was the tipping point for me… I decided to give it a shot!</p>
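<p>As a quick taste of what the package offers (this snippet is my own illustration rather than workshop code; <code>landscape</code> is the example categorical raster bundled with <code>landscapemetrics</code>):</p>
<div class="cell">
<div class="sourceCode cell-code"><pre class="sourceCode r code-with-copy"><code class="sourceCode r">library(landscapemetrics)

# 'landscape' is an example categorical raster shipped with the package
check_landscape(landscape)  # suitable input? (projected CRS, integer classes)
lsm_c_pland(landscape)      # percentage of landscape covered by each class
lsm_c_np(landscape)         # number of patches per class</code></pre></div>
</div>
<p>Each <code>lsm_*</code> function returns a tidy tibble, so the results slot straight into a dplyr/ggplot2 workflow.</p>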
<p>This post provides a preview of the workshop content. The workshop is meant to be a follow-up lesson to the <a href="https://xp-song.github.io/posts/intro2r/">Introduction to R</a> series I’ll be conducting next semester.</p>
<p><br></p>
<section id="workshop-outline" class="level2">
<h2 class="anchored" data-anchor-id="workshop-outline">Workshop outline</h2>
<p><br></p>
<ol type="1">
<li><p>Why analyse spatial patterns?</p></li>
<li><p>Landscape ecology: Conceptual models</p></li>
<li><p>Land cover classification</p></li>
<li><p>Landscape metrics</p></li>
</ol>
<p><br></p>
<p>I’ve uploaded the slides and data on <a href="https://github.com/xp-song/Intro2R-spatial">Github</a>, so anyone interested can try out the analysis for themselves. If you’re interested in geospatial analysis, check it out!</p>
<p><br></p>
</section>
<section id="slide-deck" class="level2">
<h2 class="anchored" data-anchor-id="slide-deck">Slide deck</h2>
<div class="cell">
<div class="cell-output-display">
<div class="shareagain" style="min-width:300px;margin:1em auto;" data-exeternal="1">
<iframe src="https://xpsong.com/resources/intro2r_spatial/slides#1" width="400" height="300" style="border:2px solid currentColor;" loading="lazy" allowfullscreen=""></iframe>
<script>fitvids('.shareagain', {players: 'iframe'});</script>
</div>
</div>
</div>
<p><br></p>
<p><br></p>
<p>This post is also shared on <a href="https://r-bloggers.com">R-bloggers.com</a>.</p>
</section>
<div id="quarto-appendix" class="default"><section class="quarto-appendix-contents" id="quarto-citation"><h2 class="anchored quarto-appendix-heading">Citation</h2><div><div class="quarto-appendix-secondary-label">BibTeX citation:</div><pre class="sourceCode code-with-copy quarto-appendix-bibtex"><code class="sourceCode bibtex">@misc{x._p.2020,
author = {X. P. , Song},
title = {Analysing Spatial Patterns of the Landscape},
date = {2020-02-01},
url = {https://xpsong.com/posts/intro2r-spatial/},
langid = {en}
}
</code></pre><div class="quarto-appendix-secondary-label">For attribution, please cite this work as:</div><div id="ref-x._p.2020" class="csl-entry quarto-appendix-citeas">
X. P., Song. 2020. <span>“Analysing Spatial Patterns of the
Landscape.”</span> February 1, 2020. <a href="https://xpsong.com/posts/intro2r-spatial/">https://xpsong.com/posts/intro2r-spatial/</a>.
</div></div></section></div> ]]></description>
<category>R</category>
<category>Tutorials</category>
<guid>https://xpsong.com/posts/intro2r-spatial/</guid>
<pubDate>Fri, 31 Jan 2020 16:00:00 GMT</pubDate>
<media:content url="https://xpsong.com/posts/intro2r-spatial/featured.jpg" medium="image" type="image/jpeg"/>
</item>
<item>
<title>Automated image classification into content-type categories</title>
<dc:creator>XP Song</dc:creator>
<link>https://xpsong.com/posts/photo-classify/</link>
<description><![CDATA[
<p>This tutorial describes the workflow and R code that can be used to classify a large number of images into <em>discrete</em> categories, based on their content. The source documents are available on <a href="https://github.com/xp-song/photo-classify">GitHub</a>. This tutorial provides supplementary information to the following publication:</p>
<blockquote class="blockquote">
<p>Song, X.P., Richards, D.R., Tan, P.Y. (2020). Using social media user attributes to understand human–environment interactions at urban parks, <em>Scientific Reports</em>, <em>10</em>, 808. <a href="https://doi.org/10.1038/s41598-020-57864-4" class="uri">https://doi.org/10.1038/s41598-020-57864-4</a></p>
</blockquote>
<p>An earlier iteration of the code was used in <a href="https://doi.org/10.1016/j.ecoser.2017.09.004">this publication</a>. Note that there are <a href="https://doi.org/10.1016/J.ECOLIND.2018.08.035">numerous other ways to classify images</a>, including those that deal with overlapping content.</p>
<p> </p>
<hr>
<p>The dataset <code>photos</code> is used as an example. It contains 50 photos with a column of photo <em>source</em> URLs. These are sent to the Google Cloud Vision Application Programming Interface (API), to generate up to ten keyword labels per photo.</p>
<p>Note that you will need to have signed up for the Google Cloud Platform and generated your Client ID and Client secret. We will be using the <a href="https://cran.r-project.org/web/packages/googleAuthR/index.html">googleAuthR</a> and <a href="https://github.com/cloudyr/RoogleVision">RoogleVision</a> packages to interact with the API.</p>
<p> </p>
<p>First few rows of the <code>photos</code> dataset:</p>
<div class="cell">
<div class="cell-output-display">
<div class="table-responsive">
<table class="table table-striped table-condensed caption-top table-sm small" data-quarto-postprocess="true">
<thead>
<tr class="header">
<th style="text-align: left;" data-quarto-table-cell-role="th">photoid</th>
<th style="text-align: left;" data-quarto-table-cell-role="th">url</th>
</tr>
</thead>
<tbody>
<tr class="odd">
<td style="text-align: left;">29993180834</td>
<td style="text-align: left;">https://farm6.staticflickr.com/5641/29993180834_8179c87aa7_z.jpg</td>
</tr>
<tr class="even">
<td style="text-align: left;">7002246829</td>
<td style="text-align: left;">https://farm7.staticflickr.com/6240/7002246829_d114f402e7_z.jpg</td>
</tr>
<tr class="odd">
<td style="text-align: left;">5466070643</td>
<td style="text-align: left;">https://farm6.staticflickr.com/5216/5466070643_759428f4a5_z.jpg</td>
</tr>
<tr class="even">
<td style="text-align: left;">16303185765</td>
<td style="text-align: left;">https://farm9.staticflickr.com/8571/16303185765_4dd4d48b7b_z.jpg</td>
</tr>
<tr class="odd">
<td style="text-align: left;">30414187771</td>
<td style="text-align: left;">https://farm6.staticflickr.com/5503/30414187771_5283977ca6_z.jpg</td>
</tr>
<tr class="even">
<td style="text-align: left;">16065397248</td>
<td style="text-align: left;">https://farm9.staticflickr.com/8593/16065397248_7a6a0666b1_z.jpg</td>
</tr>
</tbody>
</table>
</div>
</div>
</div>
<p> </p>
<p>Plug-in your Google Cloud Platform credentials:</p>
<div class="cell">
<div class="sourceCode cell-code" id="cb1" style="background: #f1f3f5;"><pre class="sourceCode r code-with-copy"><code class="sourceCode r"><span id="cb1-1"><span class="fu" style="color: #4758AB;
background-color: null;
font-style: inherit;">require</span>(googleAuthR)</span>
<span id="cb1-2"></span>
<span id="cb1-3"><span class="fu" style="color: #4758AB;
background-color: null;
font-style: inherit;">options</span>(<span class="st" style="color: #20794D;
background-color: null;
font-style: inherit;">"googleAuthR.client_id"</span> <span class="ot" style="color: #003B4F;
background-color: null;
font-style: inherit;">=</span> <span class="st" style="color: #20794D;
background-color: null;
font-style: inherit;">"xxx.apps.googleusercontent.com"</span>)</span>
<span id="cb1-4"><span class="fu" style="color: #4758AB;
background-color: null;
font-style: inherit;">options</span>(<span class="st" style="color: #20794D;
background-color: null;
font-style: inherit;">"googleAuthR.client_secret"</span> <span class="ot" style="color: #003B4F;
background-color: null;
font-style: inherit;">=</span> <span class="st" style="color: #20794D;
background-color: null;
font-style: inherit;">""</span>)</span>
<span id="cb1-5"></span>
<span id="cb1-6"><span class="fu" style="color: #4758AB;
background-color: null;
font-style: inherit;">options</span>(<span class="st" style="color: #20794D;
background-color: null;
font-style: inherit;">"googleAuthR.scopes.selected"</span> <span class="ot" style="color: #003B4F;
background-color: null;
font-style: inherit;">=</span> <span class="fu" style="color: #4758AB;
background-color: null;
font-style: inherit;">c</span>(<span class="st" style="color: #20794D;
background-color: null;
font-style: inherit;">"https://www.googleapis.com/auth/cloud-platform"</span>))</span>
<span id="cb1-7">googleAuthR<span class="sc" style="color: #5E5E5E;
background-color: null;
font-style: inherit;">::</span><span class="fu" style="color: #4758AB;
background-color: null;
font-style: inherit;">gar_auth</span>() <span class="co" style="color: #5E5E5E;
background-color: null;
font-style: inherit;">#You will be directed to a weblink to sign-in with your account</span></span></code></pre></div>
</div>
<p> </p>
<hr>
<section id="generate-keywords" class="level1">
<h1>Generate Keywords</h1>
<p>Create a loop to send each photo URL to the Google Cloud Vision API, and append the results to <code>photos</code>:</p>
<div class="cell">
<div class="sourceCode cell-code" id="cb2" style="background: #f1f3f5;"><pre class="sourceCode r code-with-copy"><code class="sourceCode r"><span id="cb2-1"><span class="fu" style="color: #4758AB;
background-color: null;
font-style: inherit;">require</span>(RoogleVision) </span>
<span id="cb2-2"></span>
<span id="cb2-3"><span class="co" style="color: #5E5E5E;
background-color: null;
font-style: inherit;">#add extra columns for 10 x 3 rows of data (keyword, probability score, and topicality score)</span></span>
<span id="cb2-4">photos[,<span class="dv" style="color: #AD0000;
background-color: null;
font-style: inherit;">3</span><span class="sc" style="color: #5E5E5E;
background-color: null;
font-style: inherit;">:</span><span class="dv" style="color: #AD0000;
background-color: null;
font-style: inherit;">32</span>] <span class="ot" style="color: #003B4F;