CV - Work experience





2017 – Present

IDG Direct, Ireland

Business Development Executive

  • This is a pre-sales role. I represent IDG services by making professional outgoing calls to prospective clients. I establish and maintain professional conversations with IT managers to identify their needs and upcoming investments. The information gathered is requested by our clients (the largest tech companies) and used in the next step of the sales process.


  • My responsibilities include lead generation, gathering client details, and maintaining/updating the IDG database with accurate client details.
  • I work across different markets (France, Belgium, Luxembourg, Spain, the Middle East and Africa) and contact clients in French, English and Spanish.
  • In this position, I have improved my communication skills in French and English. I have learned how to build and maintain professional relationships with clients and improved my active listening skills.
  • At IDG, I have completed a certified sales training course addressing the most important aspects of the sales process.


Communication and Sales Skills

  • I have to call IT managers to gather information about their investments. To do so, I establish and maintain professional conversations with them in order to identify their needs and upcoming investments. The information gathered is requested by our clients (IT companies such as IBM, Dell and NetApp) and used in the next step of the sales process.
  • Let's say IBM is looking to sell a particular product (a cloud backup solution, for example). IBM then engages IDG's services, asking for a number of contacts (IT managers) who are planning to invest in backup solutions. We establish professional conversations with IT managers from our database and identify those who are looking to invest in the product our client sells.
  • In this position, I have improved my communication skills in French and English. I have learned how to build and maintain professional relationships and improved my active listening skills.
  • During the phone conversations, I have to explain the product our clients are looking to sell and be able to handle objections. This experience has therefore kept me aware of the latest solutions and technologies that the most important IT companies are working on.
  • At IDG, I have also completed a certified sales training course, during which I learned and put into practice the most important steps of the sales process:
  • Prospecting, preparation, approach, presentation, handling objections, closing, follow-up.
https://www.lucidchart.com/blog/what-is-the-7-step-sales-process


Targets and KPIs

  • At IDG we need to generate what we call a «lead». A lead is a conversation that matches the criteria specified by the client. For example, if the client (say, IBM) is asking for contacts who are looking to invest in backup solutions, then every conversation in which the contact confirms they are looking for backup solutions represents a «lead».
  • We have to reach a daily target of about €650. Each lead we generate has a price, and we need to generate as many leads as needed to reach that target. An easy lead is normally worth about €65 and a complicated one about €180, so the target takes roughly ten easy leads (10 × €65 = €650) or four hard ones (4 × €180 = €720).
  • Every day we have to fight to reach this target, and we usually face several challenges:
  • Data challenges: we make calls using data that has been prepared for a particular campaign. Often you can make many calls without reaching the contacts you are looking for, and spend the whole day dialling without a single conversation with an IT manager. If you cannot reach the contact, you cannot generate leads.
  • Hard campaign challenges: the client is asking for a difficult criterion. For example, the client may want contacts who plan to invest in one specific solution (SAP applications, for instance). That is challenging because we have to reach a contact who is planning to invest in precisely that solution.
  • Solutions: there are a few techniques we apply when facing these challenges. Changing the data or the campaign you are working on is the first option, but sometimes you cannot change the campaign, because we still need to deliver the number of leads the client is asking for. We normally make calls through a platform that dials automatically, taking contacts from the database linked to the campaign, so we usually do not have to worry about the criteria (company size, job title, industry) of the contacts we call. When the data is poor, however, the solution is to research contacts manually. That is tricky: you can spend a long time searching the database for the best contacts with no guarantee of reaching them and getting leads. So when the data is good you use the platform; otherwise you search manually, and it is in this manual research that you have to propose ideas and develop a good methodology for finding good contacts. One technique for hard campaigns: if we get a lead from a particular company, we call other contacts in the same company, because we know that company is reviewing the product the client is selling.

The other approach is to search for new contacts on the internet (usually on LinkedIn), which is even trickier because it is hard to reach a new contact and turn the conversation into a lead. This is where I made a concrete contribution. The problem with external research is that most of the contacts you find on LinkedIn are already in our database, so it normally does not pay off. But I realised that when a campaign asks for business job titles, external research does make sense: our database is composed mostly of IT professionals (we have some business contacts, but not many), so the chance of finding a LinkedIn contact who is not already in our database increases a lot. By doing that, I was able to generate a good number of leads for hard campaigns, a concrete contribution I made to my team.

2014

WikiVox, France

Web Programmer

  • I was responsible for the installation and administration of a Wiki Web Application based on the MediaWiki engine.


  • Extensive experience with the MediaWiki engine:
      • Configuration of a multilingual wiki.
      • User access levels configuration.
      • Implementation of different CAPTCHA methods.
      • Implementation of a payment gateway.
      • Page categorization.
      • Take a look at my personal wiki: http://wiki.sinfronteras.ws
  • Administration of a Linux server:
      • Installation and configuration of a LAMP stack: Apache, MySQL, PHP (see the setup sketch after this list).
  • Database management: MySQL, phpMyAdmin.
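The following is a minimal sketch of how such a LAMP + MediaWiki setup can be installed on a Debian/Ubuntu server. Package names, the MediaWiki version and the target paths are assumptions for illustration, not the exact values used at WikiVox; the PHP lines appended at the end show the kind of options involved in the language, access-level and CAPTCHA configuration listed above.

#!/bin/bash
# Sketch: LAMP stack + MediaWiki on a Debian/Ubuntu server.
# Package names, version numbers and paths are illustrative assumptions.

# 1. Install the LAMP stack (Apache, MySQL/MariaDB, PHP).
sudo apt-get update
sudo apt-get install -y apache2 mariadb-server php php-mysql php-xml php-mbstring php-intl

# 2. Download and unpack MediaWiki into the web root (version is an example).
wget https://releases.wikimedia.org/mediawiki/1.35/mediawiki-1.35.0.tar.gz
tar -xzf mediawiki-1.35.0.tar.gz
sudo mv mediawiki-1.35.0 /var/www/html/wiki

# 3. After running the web installer, append site-specific settings to
#    LocalSettings.php: base language, user access levels and a CAPTCHA.
sudo tee -a /var/www/html/wiki/LocalSettings.php > /dev/null <<'EOF'
$wgLanguageCode = 'fr';                       // multilingual wiki: base language
$wgGroupPermissions['*']['edit'] = false;     // anonymous users cannot edit
$wgGroupPermissions['user']['edit'] = true;   // registered users can edit
wfLoadExtension( 'ConfirmEdit' );             // CAPTCHA framework (SimpleCaptcha by default)
EOF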


WikiVox is a nonprofit organization whose goal is to create a website (a wiki) for debates on political, economic and environmental topics. They want to create a discussion method capable of generating, at some point in the debate, an article with precise suggestions, in order to contribute to the solution of the problem.

When I was working at WikiVox, the project was just starting. The philosophy of the project was already mature, but the implementation of the Wiki was just in its first phase.

It was a very nice experience; I especially liked the philosophy of the project.

I also think that working in a small organization was positive at this point in my career, because I had responsibilities I am sure I would not have had in a big company, and that is why I learned a lot from it.

I had responsibilities related to (1) the administration of a Linux web server and (2) the design of the website.

  • Regarding Linux administration, my responsibilities concerned the installation and administration of a LAMP stack (Apache, MySQL, PHP) on a Linux server.
  • Regarding the design of the website, we used free software (the Wikipedia software). I was responsible for the installation and administration of a wiki web application based on the MediaWiki engine. Some of the functionalities we implemented (see the deployment sketch after this list):
      • We had to install the LanguageSelector extension and translate the content into 5 languages: French, English, Spanish, German and Arabic.
      • We had to install an extension to accept donations (online payments), i.e. a payment gateway for implementing a donation service.
      • An extension to categorize pages.
  • I also had to program in PHP.
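As a sketch of how such extensions were deployed on the server (LanguageSelector is the extension named above; the repository URL is an assumption for illustration, and the paths follow the LAMP sketch earlier):

#!/bin/bash
# Sketch: deploying a MediaWiki extension (LanguageSelector as an example).
# The clone URL is an assumption; paths follow the LAMP sketch above.

WIKI=/var/www/html/wiki

# 1. Fetch the extension into the wiki's extensions/ directory.
cd "$WIKI/extensions"
sudo git clone https://github.com/wikimedia/mediawiki-extensions-LanguageSelector LanguageSelector

# 2. Enable it from LocalSettings.php (older extensions are loaded with
#    require_once; newer ones with wfLoadExtension).
echo "require_once \"\$IP/extensions/LanguageSelector/LanguageSelector.php\";" \
  | sudo tee -a "$WIKI/LocalSettings.php"

# 3. Run the maintenance script so any database schema changes are applied.
cd "$WIKI" && sudo php maintenance/update.php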


Wiki - Organize information into a cohesive, searchable and maintainable system.

  • One of the most important skills I have, whose importance I usually find hard to convey, is my wiki management skills.
  • A wiki is a website on which users can collaborate by creating and modifying content from the web browser. The best example is Wikipedia: anyone can create an article, which can then be modified online by other users. A wiki is an outstanding tool for organizing information into a cohesive, searchable and maintainable system that can be accessed and modified online. The benefits of a wiki for organizing information are remarkable.
I have a personal wiki (based on the MediaWiki engine) where I document everything I am learning and working on. I use it as a personal knowledge management system that allows me to organize information into a cohesive, searchable and maintainable system. The benefits I have had from using a wiki are amazing. It has allowed me to learn more effectively and, most importantly, to constantly review and improve on important topics by providing very convenient online access (from anywhere) to organized and structured information.
Take a look at some of my Wiki pages: http://perso.sinfronteras.ws/index.php/Computer_Science_and_IT

2011 – 2012

Simón Bolívar University - Funindes USB, Venezuela

Research geophysicist of the Parallel and Distributed Systems Group (GRyDs)

Click here to see some examples of my work in Seismic modelling.

  • As a Research Geophysicist, I was responsible for performing a set of signal analysis (seismic processing) tasks and for ensuring the correct integration and implementation of geophysical applications on a computer cluster platform. This platform was being designed to facilitate task scheduling and to run computation-intensive tasks on clusters. One of my main activities was shell script programming for seismic modeling and processing.


  • My responsibilities included:
      • Shell script / MATLAB programming for signal analysis (seismic data processing and modeling).
      • Simulation of seismic wave propagation: wavefront and ray tracing.
      • Generation of pre-stack synthetic seismic data using wave propagation theories (ray tracing and finite-difference methods).
      • 2D/3D seismic data processing:
          • Deconvolution
          • Auto-correlation, cross-correlation
          • Analysis of signal noise reduction: time/frequency domain transforms
      • Task automation using shell scripting.


Task automation using shell scripting: here I could mention the generation of the images used to create seismic wave propagation videos, or the automatic generation of PDF reports (using LaTeX) containing details about the executed process, such as run time versus the amount of data generated. A sketch of this kind of automation follows.
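This is a minimal sketch of such an automation script. The modelling and rendering commands (run_model, render_frame) are hypothetical stand-ins for the group's actual programs; ffmpeg and pdflatex are the real assembly tools.

#!/bin/bash
# Sketch: render wavefield snapshots into a video and build a small LaTeX
# report (run time vs. amount of data generated). run_model and
# render_frame are hypothetical stand-ins for the group's programs.
set -e
OUT=frames; mkdir -p "$OUT"

start=$(date +%s)
for t in $(seq 0 10 500); do
    pad=$(printf '%04d' "$t")                  # zero-pad so frames sort correctly
    run_model --time "$t" --output "$OUT/snap_${pad}.dat"        # hypothetical modelling tool
    render_frame "$OUT/snap_${pad}.dat" "$OUT/frame_${pad}.png"  # hypothetical renderer
done
elapsed=$(( $(date +%s) - start ))

# Assemble the frames into a propagation video.
ffmpeg -y -framerate 10 -pattern_type glob -i "$OUT/frame_*.png" propagation.mp4

# Generate a PDF report with run time and data volume.
size=$(du -sh "$OUT" | cut -f1)
cat > report.tex <<EOF
\documentclass{article}
\begin{document}
\section*{Modelling run report}
Elapsed time: $elapsed s. Data generated: $size.
\end{document}
EOF
pdflatex -interaction=nonstopmode report.tex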


I have skills in MATLAB, Scilab and shell scripting that I gained during my participation in an R&D unit at Simón Bolívar University (the Parallel and Distributed Systems Group - GRyDs).

MATLAB (matrix laboratory) is a programming language and numerical computing environment. MATLAB allows data analysis and visualization, matrix manipulation, and numerical computation. It contains a huge library of functions that facilitate the resolution of many mathematical and engineering problems. For example, I used it for signal analysis, specifically for seismic data analysis:

  • Signal processing in geophysics
  • Ex. 1: A program that lets the user define the coordinates of the layers of a geological model by opening an image of the model and selecting, with mouse clicks, a set of points (coordinates) defining each layer. These coordinates are saved in a specific format that is used as input by another program, which builds the geological model entity used, in turn, by a third program to perform seismic wave propagation modelling.

2010 – 2011

CGGVeritas, Venezuela

Seismic data processing analyst

  • Demultiplexing, reformatting (SEG-Y/SEG-D).
  • Seismic data editing: searching for noisy, mono-frequency and reversed-polarity traces.
  • Geometrical spreading correction. Set-up of field geometry.
  • Geometry QC.
  • Application of field statics corrections, deconvolution, trace balancing.
  • CMP sorting, velocity analysis, residual statics corrections.
  • NMO correction, muting, stacking, filtering.
  • Filtering: time-variant, band-pass.
  • Post-stack/pre-stack time and depth migration (a pipeline sketch of this sequence follows the list).
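This conventional sequence can be illustrated with the open-source Seismic Unix package (the same software I later used for teaching, see below). A minimal sketch of such a pipeline; file names, velocity picks and filter corners are made-up example values, and the actual work at CGGVeritas was done with the company's own processing software.

#!/bin/bash
# Sketch of a conventional processing sequence in Seismic Unix.
# File names, velocities and filter corners are illustrative only.

# Reformatting: read a SEG-Y file into SU format and clean the headers.
segyread tape=line01.sgy | segyclean > raw.su

# Geometrical spreading correction (a simple t^2 gain).
sugain tpow=2 < raw.su > gained.su

# Predictive (Wiener) deconvolution; lags are in seconds.
supef minlag=0.004 maxlag=0.12 < gained.su > decon.su

# CMP sorting.
susort cdp offset < decon.su > cdps.su

# NMO correction with a simple velocity function, then stacking.
sunmo tnmo=0.0,1.0,2.0 vnmo=1500,2200,3000 < cdps.su | sustack > stack.su

# Band-pass filtering of the stacked section.
sufilter f=8,12,60,80 amps=0,1,1,0 < stack.su > stack_filt.su

# Quick-look display.
suximage < stack_filt.su title="Stacked section" &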

2008 – 2010

Simón Bolívar University, Venezuela

Academic Assistant - Earth Sciences Department

  • As an academic assistant, I was in charge of collaborating with the lecturer by teaching some modules of the Geophysical Engineering program at Simón Bolívar University. I was usually in charge of a group of between 20 and 30 students during theoretical and practical activities.


  • This experience contributed to my professional development in two major areas:
      • By teaching modules, I solidified a great deal of technical geophysical knowledge.
      • I also developed communication and presentation skills, as well as the leadership strategies needed to manage a group of students and to transfer knowledge effectively.


  • Courses taught:
      • Seismic data processing: concepts of discrete signal analysis, sampling, aliasing and the discrete Fourier transform. Conventional seismic data processing sequence.
      • Seismic methods: the convolutional model of the seismic trace. Propagation and attenuation of seismic waves. Interpretation of seismic sections.
      • Seismic reservoir characterization: relations between acoustic impedance and petrophysical parameters. Well-seismic ties. Seismic inversion and AVO.


I have three years of experience as an academic assistant in the courses of Seismic Processing, Seismic Reservoir Characterization, and Seismic Methods.

During my experience as an academic assistant, I solidified my knowledge of the theoretical basis of seismic processing, in particular all the technical concepts required for this position, such as seismic velocity analysis, multiples, surface statics corrections, noise attenuation, and imaging.

During my experience as a teaching assistant, I was assigned three times to teach the seismic data processing course. My work was to give theoretical and practical lessons. The theoretical part focused on signal theory (concepts of discrete signal analysis, sampling, aliasing, and the discrete Fourier transform) and on the theoretical aspects of each stage of a conventional seismic processing sequence. In the practical part, the students had to process a 2D seismic data set using the Seismic Unix software, a free package developed by the Colorado School of Mines.

I was the assistant to the lecturer in charge, but I was responsible for a large part of the course, having taught it three times.