
Software Developer
ncircleTech

Topic modeling is an unsupervised machine learning technique that can scan a set of documents, detect word and phrase
patterns within them, and automatically cluster the word groups and similar expressions that best characterize the
documents. Here, the Gensim library and the LDA algorithm were used to build the topic model and visualize its topics.
Responsibilities:
Understood and analyzed the customer requirements.
Developed NLP systems according to the requirements.
Trained the developed models and ran evaluation experiments.
Performed statistical analysis of the results and refined the models.
Extended ML libraries and frameworks to apply them to NLP tasks.
Built data pipelines to prepare data for rapid learning in a scalable manner.
Designed and built machine learning and data infrastructure by partnering with the data and production engineering teams.
Developed complex new features for building the next generation of machine learning models by combining ML techniques
with business acumen. Partnered with the production engineering team to deploy models, strategies, and verification methods.
Communicated and partnered with third-party data vendors.
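The data-preparation step in the pipeline work above might look like the following text-preprocessing sketch. The stopword list and cleaning rules are illustrative assumptions, not the project's actual pipeline:

```python
# Sketch of a small text-preprocessing step for an NLP data pipeline.
# The stopword list and cleaning rules are illustrative assumptions.
import re

STOPWORDS = {"the", "a", "an", "and", "of", "to", "is", "on"}

def preprocess(doc: str) -> list[str]:
    """Lowercase, keep alphabetic tokens only, and drop stopwords."""
    tokens = re.findall(r"[a-z]+", doc.lower())
    return [t for t in tokens if t not in STOPWORDS]

docs = ["The model is trained on a set of documents.",
        "Evaluation of the trained model."]
tokenized = [preprocess(d) for d in docs]
print(tokenized)  # tokenized documents ready for bag-of-words encoding
```

The output of such a step feeds directly into a dictionary/bag-of-words encoding for model training.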
Scan to BIM means capturing a space and turning it into a digital model that can be used for planning, monitoring, or
managing the built environment, and for communicating and sharing project information with stakeholders. GeoSLAM Scan to
BIM is a simple way of rapidly capturing essential information about a space, creating a point cloud, and importing the
data into BIM software such as Autodesk Revit.
Responsibilities:
To design and construct machine learning methods and plans.
To train and retrain ML systems and models as necessary.
To improve and extend current ML frameworks and libraries.
To create machine learning applications in accordance with client or customer needs.
To investigate, test, and put into practice appropriate ML tools and algorithms.
To evaluate the application cases and problem-solving potential of ML algorithms and rank them according to success
likelihood.
To better understand data through exploration and visualization, and to spot discrepancies in data distribution that
might affect a model's effectiveness in practical situations.
Understood and analyzed the requirements.
Cleaned the polygonal mesh files for further operations.
Converted point cloud files to formats such as .txt, .asc, and .laz for further operations.
Labeled the cleaned data for preprocessing.
Prepared the documentation for the process.
Built the installer according to the client's requirements.