A “smart city” is a modern urban area that gathers data through electronic means, voice-activation technology, and sensors. That data is then used to manage the city’s assets, resources, and services more effectively, which in turn leads to better citywide operations.
Data is gathered from residents, devices, buildings, and assets, then processed and analysed to keep tabs on and manage things like traffic and transportation systems, power plants, utilities, water supply networks, waste, crime detection, information systems, educational institutions, health care facilities, and more. The data is collected using big data techniques, and the complex workings of a smart city can then be put into effect with the aid of advanced algorithms, smart network infrastructure, and a range of analytics platforms, for example for traffic or stadium sensing, analytics, and management.
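As a rough, hedged illustration of the kind of processing involved, the sketch below aggregates some simulated traffic-sensor readings into per-junction averages with pandas and flags busy junctions. The column names and the congestion threshold are assumptions for the example, not part of any particular smart city platform.

```python
import pandas as pd

# Simulated readings from roadside traffic sensors (assumed schema:
# junction id, timestamp, vehicles counted in a 5-minute window).
readings = pd.DataFrame({
    "junction": ["J1", "J1", "J2", "J2", "J3"],
    "timestamp": pd.to_datetime([
        "2023-06-01 08:00", "2023-06-01 08:05",
        "2023-06-01 08:00", "2023-06-01 08:05",
        "2023-06-01 08:00",
    ]),
    "vehicle_count": [120, 135, 40, 38, 210],
})

# Average vehicles per 5-minute window at each junction.
load = readings.groupby("junction")["vehicle_count"].mean()

# Flag junctions whose average load exceeds an assumed congestion threshold.
CONGESTION_THRESHOLD = 100
congested = load[load > CONGESTION_THRESHOLD]
print(congested)
```

In a real deployment the same grouping and thresholding logic would run over a streaming feed rather than an in-memory table.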
Several governing bodies, locally & internationally, are striving to achieve smart city status as we speak.
One of the most innovative ideas for a big data project is presented here. The purpose of this project is to study visitor behaviour in order to ascertain tourists’ preferences and the most-visited locations, and to forecast future tourism demand.
Large volumes of data play an enormous part in this because holidaymakers use the internet and other technologies when they are away from home, leaving digital traces that can be easily collected and distributed. The vast majority of this data comes from social media platforms, and it is far too much for a conventional database to manage, which is why big data analytics is needed. The data is pooled from multiple sources and can be used by companies in the hotel, airline, and wider tourism industry to market their products & services and to expand their client base. It can also help to build a comprehensive picture of current and future trends, supporting more effective forecasting.
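As a minimal sketch of the pooling idea, the snippet below combines hypothetical check-in records from two assumed sources, ranks the most-visited locations, and makes a deliberately naive demand estimate. The field names, sources, and the averaging rule are all illustrative assumptions.

```python
from collections import Counter

# Hypothetical check-in records pooled from two assumed sources
# (a social media feed and a booking platform).
social_checkins = [
    {"place": "Old Town", "month": "2023-05"},
    {"place": "Harbour", "month": "2023-05"},
    {"place": "Old Town", "month": "2023-06"},
]
booking_records = [
    {"place": "Old Town", "month": "2023-06"},
    {"place": "Museum Quarter", "month": "2023-06"},
]

# Pool the sources and rank locations by visit count.
pooled = social_checkins + booking_records
popularity = Counter(record["place"] for record in pooled)
print(popularity.most_common(3))

# A naive demand "forecast": assume next month matches the average of
# the observed monthly visit totals for the most popular location.
top_place, _ = popularity.most_common(1)[0]
monthly_totals = Counter(r["month"] for r in pooled if r["place"] == top_place)
forecast = sum(monthly_totals.values()) / len(monthly_totals)
print(f"Forecast visits to {top_place} next month: {forecast:.1f}")
```

A real project would replace the toy forecast with a proper time-series model trained on far larger pooled datasets.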
Anomaly detection is a valuable tool for cloud platform administrators who wish to monitor and analyse cloud behaviour in order to improve reliability. It helps administrators detect unanticipated system activity, so that providers can take preventative measures before a system breakdown or service failure occurs.
A project such as this provides a reference implementation of a Cloud Dataflow streaming pipeline that integrates with BigQuery ML and Cloud AI Platform to detect anomalies. A critical component of the implementation uses Dataflow for feature extraction and real-time outlier detection.
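The snippet below is a minimal sketch of that Dataflow-style step, written with the Apache Beam Python SDK (which Dataflow executes). The metric name, the fixed baseline statistics, and the simple z-score rule are assumptions for illustration; the reference implementation’s actual features and model would come from BigQuery ML.

```python
import apache_beam as beam

# Assumed baseline statistics for one metric (e.g. CPU utilisation),
# learned offline; a real pipeline would refresh these from a model.
BASELINE_MEAN = 0.45
BASELINE_STDDEV = 0.08
Z_THRESHOLD = 3.0

def is_outlier(sample):
    """Flag readings more than Z_THRESHOLD standard deviations from the mean."""
    z = abs(sample["cpu_util"] - BASELINE_MEAN) / BASELINE_STDDEV
    return z > Z_THRESHOLD

with beam.Pipeline() as pipeline:
    (
        pipeline
        # In production this would be a streaming source such as Pub/Sub;
        # a small in-memory batch keeps the sketch self-contained.
        | "ReadMetrics" >> beam.Create([
            {"vm": "vm-1", "cpu_util": 0.42},
            {"vm": "vm-2", "cpu_util": 0.97},  # likely anomaly
            {"vm": "vm-3", "cpu_util": 0.50},
        ])
        | "KeepOutliers" >> beam.Filter(is_outlier)
        | "Report" >> beam.Map(print)
    )
```

The same filter-and-report shape carries over to a streaming job; only the source, the feature extraction, and the scoring model change.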
By Kishan Iyer, Content Engineer, DevOps, and Google Cloud Platform power user (https://www.linkedin.com/learning/learning-bigquery/the-bigquery-cost-structure?autoSkip=true&resume=false&u=83075154)
Digital marketing and attention-grabbing content are essential in the modern age. To aid accessibility and engagement, digital content should be accompanied by captions. This means managing large datasets of correlated photos and captions. Image processing and deep learning are used to understand the image, and artificial intelligence is used to generate relevant and accurate captions. Python source code for this big data task can be written for the purpose.
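As a hedged sketch of what such Python code might look like, the example below uses the Hugging Face transformers image-to-text pipeline with a publicly available captioning model. The model name and the image path are assumptions for the example; a production system would need its own training data, evaluation, and scaling.

```python
from transformers import pipeline

# Load a pretrained image-captioning model (assumed choice; any
# image-to-text checkpoint on the Hugging Face Hub would do).
captioner = pipeline(
    "image-to-text",
    model="nlpconnect/vit-gpt2-image-captioning",
)

# Generate a caption for a local image file (hypothetical path).
result = captioner("holiday_photo.jpg")
print(result[0]["generated_text"])
```

Running this over millions of images is where the big data side comes in: the captioning step stays the same, but it has to be distributed across a cluster and the image-caption pairs stored and indexed at scale.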
This one is not for the faint-hearted and is certainly one of the more “advanced” of the project ideas.