Highly analytical and process-oriented Data Analyst with a master's degree in Computer Science (concentration in Data Science and Data Analysis) and 2+ years of experience in data analysis, dashboards, databases, and report creation.
• Proficient in database concepts such as data cleaning, data visualization, design, data testing, query optimization, and generation of interactive reports.
• Proficient with tools such as Tableau, MS SQL Server, and MS SQL Server Management Studio, with working knowledge of Python, Java, HTML, and CSS.
• Proficient in SQL; worked extensively to meet end-user requirements by writing complex queries with unions, joins, sub-queries, and views.
• Experienced with Python libraries such as NumPy, Pandas, SciPy, Matplotlib, and Seaborn, as well as machine learning with scikit-learn and neural networks.
• In-depth knowledge of SAP R/3 integration, including SD integration with other modules such as FICO, MM, and PP.
• Quick learner and initiator; passionate, creative, and able to work and communicate with people at all levels of the organization with a sense of responsibility to self, team, and client; excellent analytical and problem-solving skills and the ability to handle multiple projects in a fast-paced, deadline-oriented environment.
-
Experience
EMPLOYMENT HISTORY:
DATA ANALYST | FERVID DIGITAL MEDIA AGENCY
01/2017 to 12/2017
• Worked on multiple large-scale sales-analysis projects using supervised and unsupervised machine learning algorithms in Python, drawing insights from data in various formats.
• Wrote complex SQL queries to retrieve data from databases and automated the querying process with VBA scripts.
• Performed daily batch ETL jobs and transformed data according to client requirements.
• Applied data pre-processing techniques such as data cleaning, data transformation, and data reduction to large datasets imported from sources including Excel, CSV, JSON, and text files.
• Used Python and Tableau to develop, verify, and visualize results, applying data-manipulation, machine learning, and visualization libraries to create plots for statistical data exploration.
• Advocated data analytics best practices with a focus on consistency and reusability; validated the company's growth metrics each month.
• Analyzed the company's sales records and supported evaluation of its progress through regular presentations on the analysis and the company's growth.
• Collaborated with user interface, user experience, and design teams to define the information architecture; wrote clean, precise, browser-compatible code and developed new user-facing features.
INTERN | HCL CDC
05/2016 to 06/2016
• Interned at HCL CDC on a project involving VLSI and C++ programming.
• The main project used C and C++ to run the hardware efficiently.
• Completed the project successfully through diligence and coordination with other team members.
GRADUATE TEACHING ASSISTANT | NORTHERN ILLINOIS UNIVERSITY
09/2018 to 12/2018
• Served as a teaching assistant at NIU under a professor, helping students learn data analytics effectively.
• Helped students learn Python libraries such as NumPy and Pandas, as well as Tableau, for data analysis and data visualization.
• Created and administered assignments and quizzes, and supported the professor by giving presentations to students on data analytics.
• Conducted lab sessions in which students performed data analysis on various types of data.
GRADUATE RESEARCH ASSISTANT (DATA ANALYST)
01/2019 to 12/2019
• Collected course rubrics from all courses in the department, imported from Excel, CSV, and text files, and analyzed the data using Python libraries.
• Used machine learning algorithms to predict course and department performance for upcoming semesters.
• Created interactive Tableau visualizations of the course analyses for use in presentations.
• Analyzed the department's performance and progress each semester using data on student performance across its courses.
• Analyzed student performance in all departmental courses and evaluated faculty performance by comparing it with the previous semester's analysis.
• Conducted end-of-semester surveys for all courses in the department to gather student feedback on courses and faculty; compiled and analyzed this data to assess course, faculty, and student performance.
• Presented semester-end findings to the department board on the performance of all courses and faculty, recommending steps to improve the department's overall performance.
-
Projects
ACADEMIC PROJECTS:
SOCIAL MEDIA ANALYSIS: Performed exploratory data analysis with NumPy and Pandas and data visualization with Matplotlib and Seaborn to identify the features with the greatest influence on paper popularity. Trained linear regression and neural network models on the gathered data and evaluated their accuracy. Determined which factors were most likely to contribute to the popularity of scholarly articles on a social media platform such as Facebook.
NEW YORK CITY BIKE SHARING:
Analyzed the impact of New York City's bike-sharing program on the population and its effect on public-transport usage over the past decade. Used Python data-analysis and visualization libraries such as Matplotlib and Seaborn, along with visualization tools such as Tableau and Power BI, to show bike-sharing patterns across the year and by season. The project was challenging, involving large datasets obtained from the online databases of the New York City Bike Share program.
NPR PROJECT:
Analyzed financial data from different types of educational institutions in the USA, the types of scholarships awarded to students, and economic trends over the years, using Python libraries such as NumPy and Pandas and visualization techniques with Seaborn. Identified trends using neural networks. Used visualization tools such as Tableau and Power BI to produce clear visual patterns and trend flows in the institutions' financial data.