Another Perspective On Science


Science, as we have come to understand it, is about innovation and the way forward. It may be defined both as a process and as an outcome: the process of obtaining knowledge and the knowledge that is obtained. Thomas Kuhn, a physicist and historian of science, hints at this duality when he says that science is “the constellation of facts, theories and methods collected in current texts”, while “scientists are the men (and women) who, successfully or not, have striven to contribute one or another element to that particular constellation.”

Science is essentially believed to be “value-free” and without bias. It is deemed the quest for truth and new knowledge; in this search for truth, however, lie certain prerequisites which, if fulfilled, enable the quest for knowledge. One of these prerequisites is funding for research, which governments tend to grant only to the “hard sciences.” The question this poses, however, is whether it is really a way forward when limitations have been imposed on what is to be studied, understood, and unravelled.

The answer can be found in the predominant economic system. When one views science as the primary instrument for improving the world and also sees science as a competition, capitalism seems like a perfect dance partner. Scientists can be pushed to develop life-saving and life-sustaining products by the joint allure of profit and personal recognition. Capitalist society creates the perfect situation of supply and demand, and through the competition it generates, it reaps profits. Science is used as an instrument to aid a profit agenda rather than to serve humanity, because capitalism has attached a value to everything that exists, from human life itself to what it needs to survive.

Consequently, both negative and positive outcomes have been observed; in the present day, however, associating profit with science has led to a loss for humanity. One can better understand this through an example.

Infectious diseases that used to be easily treated are once again able to kill. Roughly 700,000 people die each year from antibiotic-resistant bacteria. Yet company after company has abandoned its antibacterial research program, and today only four main companies still run one. It is the relative value the market assigns to such research that has driven this change.

One can conclude by saying that the majority of countries are on a quest to promote great technological advancement in all walks of life. However, if research is conducted responsibly, and if the driving force is serving humanity and its welfare rather than generating profit and eliminating competition, the outcome will be far greater.


Open-source RT-Thread IoT OS Launches its Embedded Integrated Development Environment

RT-Thread, an open-source embedded real-time operating system, has launched its development tool: RT-Thread Studio. RT-Thread Studio is built on Eclipse but features innovative interface interaction designs. As a deep customization of Eclipse, it is easy and simple to use; even new developers can get started quickly.

RT-Thread Studio offers project creation and management, code editing, SDK management, RT-Thread configuration, build configuration, debugging configuration, and program download and debugging.

It also combines the graphical configuration system with packages and component resources, reducing duplicated work and improving development efficiency.

RT-Thread Studio Main Features:

  • Free: The community version is free forever.
  • Language support: supports mainstream C/C++ development.
  • Simple project creation wizards: two wizard modes let you quickly start a project without having to port RT-Thread:

Creating a project based on a development board lets you quickly validate functional prototypes.

Creating a project based on a chip automatically generates driver code, with support for the full STM32 microcontroller series.

Easy-to-use graphical configuration interface & code generation

RT-Thread has a rich set of reusable components, which can be arranged from top to bottom and presented in a layered architecture diagram. All commonly used components have corresponding icons and can be operated easily with a mouse click. When you save the configuration settings, all the code is generated for you automatically.

For example:

  • One click to switch a component module on or off.
  • Right-click to view a component’s dependencies and API documentation, and to watch the online tutorial videos.
  • Double-click to open a component’s detailed graphical configuration. This greatly lowers the barrier to using RT-Thread; even if you’re new to it, you can get started easily.

Easy to download program and debug

To stay close to the habits of developers who use MDK/IAR, RT-Thread Studio adds one-click download and debugging and supports ST-Link/J-Link emulators. It also integrates a variety of terminal tools, making it easier to view logs; what’s more, you can now use the finsh command line directly in the IDE.

Software package market offers a variety of package resources

RT-Thread offers a wide range of software packages, and the Studio innovatively uses a web-style interaction so you can quickly find the exact packages you want, by following these steps:

  • Search Package
  • View Document
  • Download Code
This simplifies the software package workflow, giving you a development experience like assembling building blocks.

Powerful code editing and refactoring functionality

RT-Thread Studio provides powerful code-editing functions such as code assist, code templates, code formatting, and code refactoring, which improve coding efficiency and make code more consistent and standardized.

Rich debugging facilities to quickly view and track code issues

It supports a variety of emulators and integrates multiple terminal tools for viewing chip core registers, peripheral registers, variables, expressions, assembly code, memory data, and more. It also supports assembly-level stepping and breakpoint management to quickly view and locate code problems.

SDK Manager supports online downloading and updating the latest source package of RT-Thread

The SDK Manager enables fast online download of stable RT-Thread release packages and keeps your copy of the RT-Thread source up to date.

Recently, RT-Thread’s global website was launched, marking the beginning of RT-Thread’s global operations. Developers are a vital part of the RT-Thread open-source ecosystem, so RT-Thread next hopes to build more connections with developers around the world and work together to make RTOS great.

To download RT-Thread Studio, please visit www.rt-thread.io



Why Do We Need a New Product for Geospatial Analytics for Mobility

With the prevalence of GPS all around us and the rise of smartphones, location data is being collected in abundance. Legend has it that almost all the data (~80%) that companies store has a location component.

Our lives are now filled with products and services available at the tap of a phone, getting things delivered in minutes to our doorstep! Today, it is hard to find an app which doesn’t ask for your location permissions.

There is a famous quote in the GIS (Geographic Information Systems) world: “everything happens somewhere.” When you place an order on any food delivery app, every event happens at a certain lat-long: placing the order, a delivery executive (DE) getting assigned, the DE picking the order up from the restaurant, and finally delivering it to you!

Which brings me to my next point:

Firstly, there is no “mobility” without dynamic location. In other words, location is a fundamental aspect whenever assets (people, vehicles, cargo, parcels) move on the ground. After all, in cities and towns where things change with every square kilometer, it’s important to add the “context” of those areas.

For the on-demand economy that we live in, analyzing location data across all your dimensions (users, stores, partners) in real time becomes critical: you have to match supply with demand, run location-based pricing and promotions, and make faster, more accurate deliveries at scale!
And finally, the most successful location-based experiences for consumers are based on sequential activities at a very granular level: Where are you? Where do you need to go? What do you want to do when you get there? This becomes important when you need to acquire customers by targeting ads and retaining and engaging them by studying their behavior.

Location is not only about a point on a map. It is about a line. It is about movement.

But all this time, the location component has been largely neglected in business decisions. This is because conventional analytics tools are fantastic for the treasure trove of statistical data that businesses have, but they fall short at location-based decisioning.

You might be wondering: “But, why?”

Before we deep-dive into that, let’s understand how geospatial data is different from statistical data.

Location is exciting! If played right, it can drive significant revenue improvements (case in point: Uber & Airbnb). Its high usage and real-time nature make it really valuable and sticky.

Just as text, sound, and images are different kinds of data, latitude-longitude is a different kind of data that can add immense depth, meaning, and insight to statistical data in a space-time context.

Statistical data comes in a tabular format and usually comprises two elements: values across time. Geospatial data (also called geographic data or location data) is often formatted as points (latitude-longitude coordinates), polygons, or polylines.
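To make these formats concrete, here is a minimal Python sketch using the shapely library; the coordinates are invented for illustration:

from shapely.geometry import Point, LineString, Polygon

# A point: a single lat-long observation (shapely expects (x, y) = (lng, lat))
pickup = Point(77.6245, 12.9352)

# A polyline: an ordered sequence of pings, e.g. a delivery route
route = LineString([(77.6245, 12.9352), (77.6270, 12.9371), (77.6301, 12.9394)])

# A polygon: an area boundary, e.g. a neighbourhood cluster
area = Polygon([(77.62, 12.93), (77.63, 12.93), (77.63, 12.94), (77.62, 12.94)])

print(area.contains(pickup))  # True: this pickup falls inside the area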

Geospatial data incorporates a third dimension on top: values across time and dynamic location, which requires a completely different approach and treatment. Computing metrics for static assets involves plotting points on a map and calculating metrics on them; movement analytics concerns itself with how we visualize, analyze, and optimize how things move on the ground!

Two special properties of geospatial data are autocorrelation and non-stationarity, which make it difficult to meet the assumptions and requirements of traditional (nonspatial) statistical methods, like OLS regression.

Spatial autocorrelation describes how similar (correlated) measures are between nearby observations. It falls in line with the first law of geography: “everything is related to everything else, but near things are more related than distant things.”
Spatial nonstationarity is a condition in which a single “global” model cannot explain the spatial relationships between the variables: the nature of the model alters over space and behaves differently in different parts of the study area.
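To make spatial autocorrelation tangible, here is a small, self-contained Python sketch that computes Moran's I for a toy set of areas. The values and the binary adjacency weights are invented for illustration; in practice you would use a library such as PySAL's esda:

import numpy as np

def morans_i(values, weights):
    # Moran's I: close to +1 = clustered, ~0 = random, close to -1 = dispersed
    n = len(values)
    z = values - values.mean()
    numerator = n * (weights * np.outer(z, z)).sum()
    denominator = weights.sum() * (z ** 2).sum()
    return numerator / denominator

# Four areas: the first two neighbour each other, as do the last two
values = np.array([10.0, 12.0, 3.0, 2.0])  # e.g. order counts per area
weights = np.array([
    [0, 1, 0, 0],
    [1, 0, 1, 0],
    [0, 1, 0, 1],
    [0, 0, 1, 0],
])
print(morans_i(values, weights))  # positive: nearby areas have similar values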

Let’s answer our pending question now:

Today, the way location data is handled inside companies is broken. Given the dearth of location-analytics products out there, most businesses have no choice but to rely on traditional BI and analytics tools. Unsurprisingly, this is a highly ineffective strategy, because these tools are not really meant for geo-analysis.

1. Formats

Most companies collect location log data in the form of pings. The way to analyze the patterns they encompass is to have an infrastructure that can ingest these pings in real-time — something that platforms like Periscope or Tableau don’t cater to.

Moreover, location data sits across disparate databases, in different structures. Hence, slicing and dicing variables across tables in real time becomes even more complex.

2. Visualizations

All of us know that a statistical dashboard contains bars and charts which sprout from carrying out operations (sum, count, divide, average, etc.) on variables. While getting live trend updates through spikes and dips on graphs might be helpful, these charts work better on aggregated historical data.

Adding or dividing lat-longs and creating bars and charts on them is pretty futile. To make sense of these lat-longs, you need to have a map by your side to understand their spatial distribution!

Another important aspect that governs all the properties of a geospatial dashboard is the layer. A dashboard without this layering mechanism, which lets you display multiple types of data points on the map, rather misses the point. For instance, layers help in viewing how my orders (first layer) and partner locations (second layer) are distributed across my area clusters (third layer).

Maps are also more insightful than bars and charts for drawing inferences when components are moving on the ground. Hence, real-time geographic analysis becomes fundamental when everything is dynamic.

3. Aggregation

Traditional BI tools like PowerBI and Geckoboard offer the capability to plot points on a map. However, just plotting points on a map is not adequate: a billion dots in space are not very intuitive. Moreover,

Location intelligence is so much more than tracking and plotting points on a map!

Strategies like clustering, heat mapping, aggregation, indexing, etc. come in handy to absorb a large number of points.

Some tools like Tableau and Periscope also allow the creation of heat maps — a fantastic way to depict the patterns of metrics. The disadvantage of heatmaps is that they are only a visual representation, which prevents you from running any further analysis on them.
To know more about why heatmaps don’t work, check this out!
A more efficient way to aggregate points is to index them on hexagonal grids or geohashes, as the sketch below illustrates. Once you analyze the pattern of your metrics (such as demand and supply) across grid cells, you can also use each cell as a single unit in your models. Indexing also lets you go very granular in your analysis.
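As a sketch of what such indexing might look like in Python, here is a toy aggregation of pings onto hexagonal cells, assuming Uber's h3-py package (v3 API); the coordinates and resolution are illustrative only:

from collections import Counter
import h3  # pip install h3 (v3 API assumed)

# Raw GPS pings as (lat, lng) pairs, invented for illustration
pings = [(12.9352, 77.6245), (12.9355, 77.6248), (12.9710, 77.5946)]

resolution = 8  # hexagons of roughly 0.7 km^2; higher = more granular
demand_per_cell = Counter(h3.geo_to_h3(lat, lng, resolution) for lat, lng in pings)

# Each hex cell can now act as a single unit in downstream models
for cell, demand in demand_per_cell.items():
    print(cell, demand)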

4. Data Preparation

Cleaning: Using Periscope or ThoughtSpot, you can clean your statistical data by taking care of blanks, spaces, data formats, NAs, etc., whereas cleaning GPS data involves snapping it back to the nearest road or correcting for spatial outliers. (You must have noticed while using Google Maps that the GPS position sometimes jumps far off at random.) It is safe to say that GPS as a technology still has miles to go!
Merging: Platforms like Tableau Prep or Metabase allow you to merge two tables using an inner, left, or right join on the basis of a common identifier. It’s quite difficult to do spatial merges using these platforms if, for instance, you have data across three dimensions: users, delivery partners, and stores (which sometimes come in different formats).
A spatial join involves inserting columns from the feature table of one layer into another based on their spatial relationship: for example, merging a Koramangala area polygon with all the ride pickup points that fall inside it, as sketched below.
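Here is a minimal sketch of such a spatial join using GeoPandas (version 0.10 or later for the predicate argument); the pickup points and the rectangle standing in for the area polygon are invented for illustration:

import geopandas as gpd
from shapely.geometry import Point, Polygon

# Ride pickup points (coordinates invented)
pickups = gpd.GeoDataFrame(
    {"ride_id": [1, 2, 3]},
    geometry=[Point(77.62, 12.93), Point(77.63, 12.94), Point(77.70, 12.99)],
    crs="EPSG:4326",
)

# A rectangle standing in for the Koramangala area polygon
areas = gpd.GeoDataFrame(
    {"area_name": ["Koramangala"]},
    geometry=[Polygon([(77.61, 12.92), (77.64, 12.92), (77.64, 12.95), (77.61, 12.95)])],
    crs="EPSG:4326",
)

# Attach the area's columns to every pickup that falls inside it
joined = gpd.sjoin(pickups, areas, how="inner", predicate="within")
print(joined[["ride_id", "area_name"]])  # rides 1 and 2 fall inside the area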

Enriching: Enriching data means adding new layers of information by merging it with third-party or external data sources. In the GIS world, we enrich spatial data for better context about the areas in which the points are present: adding the environmental layers of roads, traffic, weather, points of interest, demographics, buildings, etc.

Some companies, of course, realize this and hack around open-source tools (like Kepler.gl or QGIS). But these open source tools come with their own list of constraints and limitations.

However, the issue doesn’t get resolved here because geospatial data itself comes with a bucket full of challenges.

Performing geospatial queries on streaming data becomes very compute-intensive, and legacy technologies (like ArcGIS) provide very little support. The complexity increases further when visualizing large geospatial datasets with any sort of interactivity at scale.

Sometimes developers build their own internal tools, but most of the time these are not well suited to all the different audiences inside the company. Since such tools are not built in a scalable way, maintaining them often sucks up a lot of developer bandwidth!

A lot of times there is even a repetition of effort and the wheel keeps getting re-invented over and over again. As Jeff Lawson from Twilio said — “It is easier than ever to build software but harder than ever to operate it”.

It all started with a personal problem. As data scientists working with geospatial data, we found the existing analytics products futile in our daily workflows. Hence, we had to build our own tools and libraries for our everyday work.

We then realized data scientists around the globe face similar problems when it comes to location data. As a result, businesses are struggling to attain operational efficiency!

At Locale, we plan on solving these problems once and for all. We want to make all of these processes less painful and build a world where it is easy to get all your geospatial answers in minutes! That’s why we are going back to the drawing board and handcrafting this experience completely from scratch.

So, the next time you want to order medicines in an emergency, hopefully you won’t read on the screen, “Delivery guys are not available. Please check again later.” Next time, the delivery workers won’t have to stand idle in the scorching heat, cold, or rain waiting for orders to come in.

They can be incentivized to move to high-demand areas and earn more money. The push notifications you get won’t be spam; they will be sent to you at the right place and the right time.

Read Similar:

  1. Bridging Supply-Demand Gaps in Last-Mile Delivery Companies Geospatially [Link]
  2. Carto vs Kepler vs Locale: Which product to use for geospatial analytics? [Link]

How to Find Out If Your WordPress Site Was Hacked


There are signals that you can notice to check if the security of your site is intact. By taking action at the right time, you can protect your online presence from any severe damage.

  1. Changed Homepage
  2. WordPress Login
  3. Site goes Offline
  4. Redirection
  5. Web Browser Alerts
  6. Search Console Alerts
  7. Search Results
  8. Unknown Links Added
  9. Popup Ads
  10. Unknown User Accounts
  11. Emails
  12. Traffic Surge or Drop

Changed Homepage

Hackers don’t usually make visible changes to a site; they want to stay under the radar after getting access. But if your homepage looks different than it is supposed to, it is a sure sign that somebody else is customizing your site.

Hackers mostly leave a message on the Homepage announcing that they hacked your site. Sometimes their intention is to ask for money. In that case, they leave their Bitcoin address too.

WordPress Login

Being unable to log in to your site is a major giveaway that your WordPress site has been hacked.

It may mean the hacker has deleted your admin account. You would not be able to reset the password, as your account no longer exists.

There are other ways to add your account back to the WordPress site, and a quality hosting company can help you regain your account and secure the site.

Site goes Offline

It is the job of your hosting provider to actively look for any hacking attempt directed at their client’s websites.

When their security team sees an infected site, they take it down. This way, they stop the infection from spreading to other websites or to the central server.

Redirection

Hackers may redirect your site to spammy sites. This can be an indication that someone has hacked your domain name account.

You have to check your Domain Name account and Hosting account to find the exact reason for the redirection.

Smart hackers redirect the site only for logged-out users. Logged-in users keep using the website as usual, without noticing that it has already become a victim of hacking.

This redirection leads to a drop in the traffic of the site. 

Web Browser Alerts

Chrome or any browser shows a warning when a person tries to visit a webpage that hosts harmful content.

If any of your visitors see this message, your site likely contains malicious code. In these cases, hackers use the site as a host to harvest information from visitors.

Warnings can be of different types.

Search Console Alerts

Google Search Console alerts the webmaster when there is something wrong with the security of their site. You can find these alerts in the Security Issues tab.

Search Results

If the search results show gibberish or Japanese characters, your site’s backend is not secure.

There are three kinds of hacks that are most popular:

  • Japanese Keywords Hack
  • Gibberish Hack
  • Cloaked Keywords and Links Hack

In these hacks, attackers use your website to host spammy content.

You might see clean meta descriptions and meta titles, but Google’s crawlers read the content differently.

Check your Site Search results: site:yoursite.com

Unknown Links Added

If you notice external links on your site that you don’t remember adding, someone else is placing links on your website.

There are marketplaces where hackers sell backlinks from authority sites. They get a handsome amount in return, making the hack worth their time.

Mostly these links are sitewide, but not always.

Keep checking the outbound links of your site. If there are links on your site that lead to shady content, your site is also in danger.

Popup Ads

Popup ads appear on your site when hackers want to earn money.

They inject spammy ads that open the hacker’s affiliate links. The popups open in a new window, so even the users don’t notice them.

These ads are not shown to regular visitors; only traffic coming from search engines gets to interact with such advertisements.

Unknown User Accounts

If you find an unknown user account in the WordPress Users section, it is possible that your WordPress site has been hacked.

You can quickly delete such spam accounts from the dashboard area.

If the hacker adds an account, though, they will surely give it the admin role, and you will not be able to delete an admin account from the WordPress admin area.

You have to delete such unknown accounts from cPanel.

Emails

Every good hosting service provides free email accounts. These email accounts are useful for WordPress and business-related mail.

When hackers compromise a site, they use these email accounts to send huge volumes of spam. As a result, services like Spamhaus.org flag your mail account as a spam source.

You may not notice at first that your site is hacked; you will just notice that email has stopped working. So if your email suddenly stops working, run a security scan.

Traffic Surge or Drop

A sudden hike or drop in traffic is also unusual activity.

Browser safety alerts and redirection can drop traffic, while bot attacks increase it. In both cases, your site is not secure.

Excessive spam traffic can also increase the load on your servers. Actual visitors will find the site slow because the bots are eating the bandwidth.

That’s why keeping an eye on the site stats is essential.

Final Words

If you watch for the signs mentioned in this list, you will detect a hack early and be able to prevent severe damage.


You Need to Know What is erb in Rails and How to Master it

One of the first things that made me uncomfortable about learning Rails was taking a look at the views. I found some weird files with some weird syntax. They were named something like ‘name.html.erb’, and I was like, what? Why does this file, which seems to be an HTML file with HTML-like syntax, have a .erb extension after the .html extension?

Those weird symbols, which looked somewhat like HTML tags, were a mystery to me at that point. So I started investigating, and this is what I found:

What is erb?

‘erb’ refers to Embedded Ruby, a template engine that embeds the Ruby language into HTML. To make it clearer: it is the engine needed to use Ruby, with all its features, inside HTML code. We’ll see its application in the next sections.

Rails uses erb as its default engine to render views, through a specific implementation called Erubi.

To use erb in Rails, view files should have a .html.erb extension so that Rails processes them with its template engine.

How does it work?

During my investigation I found that three kinds of tags are used:

Expression tags <%= %>

This tag indicates that there will be an expression inside it. The main point: The application will render the result of executing the code inside this tag. Let’s check an example to make it clearer:

I assume you already understand the MVC architecture in Rails, but if you don’t, worry not; I’ll try to explain it as clearly as I can:

Imagine we have a User model whose attributes include an id, a name, and an email.

The controller will ask the model to retrieve the first user from the database and store it in an instance variable called @user (for instance, with @user = User.first in the controller action).

If we would like to display the first three user attributes in a view, we would use HTML and erb code like the following:

<h1>First User Information</h1>
<ol>
  <li>User id: <%= @user.id %></li>
  <li>User name: <%= @user.name %></li>
  <li>User email: <%= @user.email %></li>
</ol>

And the browser will display the rendered result: an ordered list with the user’s id, name, and email.

Remember: The app will render the result of executing code inside expression tags. For example <%= 2 + 2 %> would render 4.

Execution tags <% %>

Like expression tags, they embed Ruby code inside HTML. The difference is that the application won’t render the result of executing the code. They are frequently used for Ruby control structures. Let’s review an example to understand them better:

Imagine you are building a view to display all the users in your database. The controller will ask the model to retrieve all the users from the database (10 in total) and store them inside an instance variable called @users. You have two ways to display them: the efficient one and the inefficient one. Let’s start with the latter:

A very inefficient way to display a list of users would be building a view like the following one:

<h1>All Users</h1>
<ol>
  <li> <%= @users[0].name %> </li>
  <li> <%= @users[1].name %> </li>
  <li> <%= @users[2].name %> </li>
  <li> <%= @users[3].name %> </li>
  <li> <%= @users[4].name %> </li>
  <li> <%= @users[5].name %> </li>
  <li> <%= @users[6].name %> </li>
  <li> <%= @users[7].name %> </li>
  <li> <%= @users[8].name %> </li>
  <li> <%= @users[9].name %> </li>
</ol>

That view will render:

Imagine there are not 10 users but thousands of them. That’s when a loop becomes useful, and we can use it with the help of execution tags and expression tags as shown below:

<h1>All Users</h1>
<ol>
  <% @users.each do |user| %>
    <li> <%= user.name + ' efficiently displayed' %> </li>
  <% end %>
</ol>

Note three important points:

  1. We use an ‘each’ loop to iterate over the users. That way our code no longer depends on how many users there are in the database.
  2. We use execution tags (<% %>) to wrap the block structure. That is the Ruby code the application won’t display.
  3. We use expression tags (<%= %>) to wrap the chunks of code the application will display: in this case, each user’s name plus the text ‘ efficiently displayed’.

Below you can check the results of the implementation:

Comment tags <%# %>

Used for placing Ruby comments inside your code. For example:

<h1>All Users</h1>
<ol>
  <%# loop through every user %>
  <% @users.each do |user| %>
    <li> <%= user.name + ' efficiently displayed' %> </li>
  <% end %>
</ol>

The code inside the comment tags gets ignored at the moment of building the template.

How to master it?

Let me tell you something great: mastering erb will be a matter of minutes if you are already good with Ruby and HTML. Otherwise, it will be a matter of hours of practice until you get it.

If you plan to become a professional Rails developer, you will practice this skill a lot, so don’t worry. You will master it sooner or later.

One piece of advice that always works: practice and experiment a lot; that’s the only way to master a tool.

Important: avoid logic-heavy erb code in your views. For those cases, it is always better to create a helper method to deal with it.

You can check other sites too if you want to learn more about erb.


3 Ingredients I Think You Should Give Up Forever

Life expectancy is going backward in both the U.S. and the U.K. This isn’t a blip like World War I plus the Spanish flu combo. This is a downward trend since 2015. When’s it going to stop? What’s happening to our quality of life?
For one thing, chronic disease is on the up and up. The World Health Organization estimated levels would increase 57% from 2001 to 2020. Previously, the increase in chronic disease was blamed on the aging population — the idea being the longer you live, the more time disease has to manifest. But now those ideas are diverging.

What may be to blame? How might you beat the odds of developing chronic disease?

We need all the help we can get in this honking, stressful, over-dopamined, keep-up-with-the-Joneses, ring ring, beep beep, oh Christ, how do I look in this photo? world we call the present.

As a registered nutritionist, I often explain to people that a good diet is as much about what you don’t eat as it is about what you do. In today’s developed world, we’re overfed yet undernourished. Junk has taken over.

This is largely because we concern ourselves with calories first and believe that by staying below a given number, everything will be hunky-dory. And all the while, we miss out on the real reason we eat: to obtain nutrients.

If you make the effort to eat nourishing foods — unfashionable or a pain to prepare by today’s standards — the calories will take care of themselves. And, for the most part, so will your health.

If you’re the type who prefers to know what you need to stop doing (before you commit to a new start), I’ve got you covered. Here’s my list of the three empty foods you should throw away, leaving space for you to enjoy real food that was born or grown.

Highly refined grains are the backbone of many junk foods. By avoiding them, you will steer clear of many things that are best thrown away by default.

We live in a developed world of abundance. So much choice! We’re positively drowning in dietary decisions. So, why go for a highly refined grain like white bread or pasta when you can go for a more nutrient-dense option? It’s because you don’t like the latter, isn’t it?

Einkorn, rye, spelt, buckwheat, kamut, and other old-fashioned grains are better choices, made with sourdough rather than modern baker’s yeast. The fermentation breaks down the antinutrients, making the actual nutrients more easily assimilated and less likely to cause digestive stress. So, go on and try something new! Get out of that rut! The other option is to remove grains from your diet completely (after all, there’s no such thing as a grain deficiency).

I don’t particularly like grains. I think they emerged from a time of famine in brittle environments after we successfully wiped out most of the really large and tasty mammals symbiotic with the lush grasslands. Sahara! But that’s another story.

If you’re unwell, I would highly recommend you do this or skip cereals altogether. Those who are robustly healthy may well be getting away with it, for now. Sprouted bread is the only type I eat and, unlike the other types, the only one that doesn’t make me feel unwell.

Sugar is added to almost everything we eat. You can’t even buy peanut butter that hasn’t been altered by the manufacturer so that you just can’t stop eating it.

Eating added sugar — often hidden within foods under one of over 68 different names — is an excellent way to become overweight and ill.
Not only does this add lots of energy to your diet that you don’t burn off — bed, train, office, train, sofa, bed, repeat — you’re also prodding your hormonal system multiple times per day. Nothing likes being prodded. Simply put, this combination of things will create extra adiposity (fat) and cause disease. Lots and lots of disease.

Avoid added sugars, and you avoid junk foods. Can you see a pattern emerging?

Extruded vegetable oils were brought into our lives in about 1910 when the processing of them became cheap. Make no mistake, these oils are a food technology that should never have been developed.
On the face of it, these oils seem like good value, but they have brought with them health risks that far outweigh any benefit. A study published in the British Medical Journal found that replacing saturated fats with vegetable oils increased death from “all causes” by 62% and from cardiovascular disease by 70% when measured against a control group. This study is strong enough to show causation by vegetable oils, versus the “associations” seen in much nutrition research.
Sadly, vegetable oils are implicated in the development of cancer. A study of mice demonstrated double the malignancy in those given vegetable oil versus those given fish oil. Another rodent study demonstrated the normal and healthy process of cell death (apoptosis) was prevented by the ingestion of corn oil, and as a result, colon cancer developed.

To keep things simple, I have a rule of thumb: If the oil comes from something that is not obviously oily, then don’t eat it — ever.

“That large plastic container of vegetable oil next to your stove right now? Throw it away or use it on your bike chain!”

The major problem with these oils is that their precious omega-6 polyunsaturated molecules are easily damaged during the extensive processing. They’re additionally vulnerable after bottling, when they react to oxygen and sunlight. This damage makes the molecules reactive and inflammatory.

Vegetable oils are the true wolf in sheep’s clothing — sold to us like they were doing us a favor when in fact, they’re stabbing us in the heart. This stuff isn’t food, plain and simple. Do not eat this crap. That large plastic container of vegetable oil next to your stove right now? Throw it away or use it on your bike chain!

If you combine these three things — highly refined grains, added sugars, and vegetable oils — you get something like a doughnut. A doughnut is all smiles, taste explosions, and pretty boxes, but I kid you not: Foods like this will shorten your life and the quality of it.

Stop waiting until Monday or next month or Lent or whenever else to enact these changes. Do it today. Stop snacking because that’s where these foods live for the most part. And if foods like this have taken over your main meals, you’re in real trouble.

Ask yourself what role these foods play in your life. Make sure you are eating nutrient-dense foods and making the time to reconnect with your kitchen because if you don’t, disease will make time for you.

Picture yourself in a few months. You’ve successfully removed this junk from your diet and you’re feeling amazing. Back into the old jeans you wore at university (too bad they’re boot cut), skin glowing, energy up, and your boss thinks you’re the best. Cha-ching!


I’m No Longer an HTML Beginner. What’s Next? SEO Education.

I have had opportunities to participate in and even organise programming bootcamps. Some of them had the purpose of just introducing the basics of a given stack to beginners, whereas others were somewhat advanced.

In this article, I am going to talk about best practices for enhancing Search Engine Optimisation (SEO) in HTML pages, as a “BRIDGE” from being a novice to an expert.

Why does SEO matter?

For many websites, being ranked on the first page of Google search results is the best thing that can ever happen to them.

Imagine you own an online platform that teaches how to cook. What can you do to stand out among 14,900,000 results and land on the first page? People believe the content they really want lives only on the first page. One of the major purposes of your website is to bring in visitors, and remember, big traffic comes from search engines such as Google, Yahoo, and Bing.

Content that appears on the 250th page of search engine results won’t do anything for your business, right?

What is Search Engine Optimisation?

Search Engine Optimisation is an array of practices for increasing a website’s visibility in search engines so it has a higher chance of being easily found. Most of these practices are applied in HTML, as a way of conveying what the website is about so it can show its content to the right people.

The power of HTML Tags in Search Engines

Don’t underestimate “tags”. Tags tell search engines how to read the content of the website and classify it by type. In fact, you can significantly improve search engine visibility simply by adding the right tags in HTML.

When a search engine’s bot crawls a website, it analyses every single HTML tag and thereby determines what the content is about.

Top HTML tags you shouldn’t leave aside

1. <title>

Inside the <head> tag you should have a <title>, because it helps determine the subject of the particular page.

<title>How to prepare italian sausage</title>

2. <meta name=”description”>

The meta description is used to provide a description of the web page:

<meta name="description" content="Know how to cook sausages? So they're not weird and maybe raw in the middle? No? This two-step, simmer-then-sear method is for you.">

There are a couple of other tags you should have in the <head> as well, such as the canonical tag, the robots meta tag, Open Graph tags, etc.

3. Header tags (h1 to h6)

Header tags are very important to the structure of web content. They are not mandatory when designing web pages, but if you want to avoid discouraging search engine bots from crawling your website, you will need them. These tags play a vital role in classifying content efficiently: the order of your header tags reveals the level of value and dominance of the content in each section.

<h1></h1>: This is reserved for the title of the web page. Usually there should be only one on the entire web page.

<h2></h2>: H2 highlights the topic of the title.

<h3></h3>: H3 reflects points regarding the topic and should support the subcontent of <h2>.

<h4></h4>: H4 supports the subcontent of <h3>.

<h5></h5>: H5 is not often used and supports the subcontent of <h4>.

<h6></h6>: H6 is also not often used and supports the subcontent of <h5>.

4. The “alt” attribute in images

One of the most effective ways of engaging the audience is to use images in web pages. According to Medium.com, when a person reads an article, they spend more time viewing images than reading text, which shows how interested audiences are in photos. However, images in your article are not easy for bots: it is very difficult for search engine crawlers to determine what an image contains just by loading it. The “alt” attribute tells search engines what topic the picture relates to.

Without the “alt” attribute, search engines would not know what the image is about, which could keep your images from being indexed in search results.

<img src="./path/image.jpg" alt="Italian sausage recipe">

Recommendation

My journey to becoming familiar with HTML & CSS wasn’t too tough. However, I did not know what to do next after designing fancy pages. I recommend that all beginners also think about how to facilitate search engine optimisation in their deliverables.

Conclusion

In this article, I covered a few best practices for improving search engine optimisation. Feel free to add or suggest anything in the comment box.

Thanks for reading.

#Gracias!


How 350+ PhDs and AI Researchers are Banding Together to Fight the Corona Outbreak

Disclaimer

This is a story about how I published a cry for help on my LinkedIn less than a week ago and somehow found myself coordinating a group of 350 PhDs and AI specialists who are working 24/7, without any compensation, to help humanity fight the coronavirus.

If this doesn’t fascinate you as a phenomenon, then please stop reading and get back to watching Netflix. This is a story about how 350 people are refusing to watch TV and scroll Facebook feeds; they are making a real difference and impact, right now.

It’s a story of how the global community is coming together and helping spread the word, be it an investor from a top VC fund or a person who got laid off and has nothing else to do.

So… why are we doing this?

We are all in this together, living through a unique time when every single country is equal and every single layer of society is susceptible to the common enemy. Yes, Tom Hanks is as equal as a Middle Eastern dictator, Justin Trudeau is as equal as Trump, and yes, you are as equal as the wealthiest 1%.

Suddenly nationalism doesn’t matter; it doesn’t matter whether you support capitalism or socialism, whether you are a Democrat or a Republican, a legal or an illegal immigrant. It just doesn’t matter.

Both the epidemic itself and the resulting economic crisis are global problems. They can be solved effectively only by global co-operation. First and foremost, in order to defeat the virus we need to share information globally. That’s the big advantage of humans over viruses. (Yuval Noah Harari)

Some people, though, are still skeptical and are strong believers in conspiracy theories, comparing it to ordinary flu by the number of deaths and saying that people were dying from many other diseases and no one ever cared.

And yes, people were dying from many different diseases every day, and it never seemed like everyone dropped what they were doing to immediately jump into research on diabetes, HIV, etc.

(footnote: why didn’t we care? why shouldn’t we care??)

So what’s different now? It’s quite simple, and I don’t need any flatten-the-curve graphs to explain it, even though everyone seems to love those graphs and pretends they are a cure or silver bullet for the current crisis. For those believers, here’s some not-so-controversial advice: they are not a solution.

(for fans of “Idiocracy” movie)

It’s an attempt by our brains to keep us sane by pretending we are safe, assuming everyone is a responsible, diligent, and disciplined individual who understands the highly complex long-term effects and second- and third-order consequences of every single action taken (thank you, Ray Dalio).

See? No one cares about things like virality, it’s too complex and harder to understand/model than two basic curves that literally scream at you

“we know how to solve this, it’s actually very easy, see the graph, don’t you see it? it actually moves and… it’s got some green electrolytes!”

Even google doesn’t care about virality, it actually explains what virality is in terms of internet memes… c’mon guys?

And it’s really all about virality. And also novelty, complexity and lack of our ability to explain this emerging common enemy. There is literally ZERO understanding. Everything that you hear in the news is an educated guess combined with statistically irrelevant data.

And you know, I don’t actually blame governments; this is a product of the broken system that we are observing as we sit at home, quarantining ourselves into boredom. And god bless the US government, because it actually acknowledges this and is really trying to fix the mess.

And that’s exactly why the White House, in cooperation with the Allen Institute for AI (backed by a Microsoft co-founder), decided to reach out to us. Not to me personally, but to the whole world of AI researchers like me, to seek help and get that precious brainpower of ours.

What happened next is history. I can’t explain it, I can’t model it and I certainly wasn’t ready for it. Still not ready as I’m carving out a few mins out of another 20h workday to finish this article.

Timeline to impress you with some numbers…

Day 1: I talked to 10 people, had 15 calls.

I discovered the COVID-19 Kaggle challenge, and my main learning was that there are many hardcore machine-learning engineers willing to help, but they have no clue what to do. I decided to help shape that structure and bridge the gap between general questions like “What are the risk factors of COVID-19?” and the technical formulation of a machine-learning problem.

Day 2: I talked to 30 people, had 40+ calls.

I realized I couldn’t do it alone anymore and that I’m not the smartest person to help. I decided to create a Trello board and a Slack and actually project-manage this impressive chaos.

Day 3: I talked to 60 people, stopped counting calls.

I realized that the core focus should be something interactive, something that would produce visual results. We decided to score the 10 existing tasks and focus on the top 3, based on our subjective reasoning and immediate impact.

Day 4: the group had 150 members.

Something crazy started happening: more self-organization began to emerge, and some non-technical people started helping me manage the others. We began to realize we had a new form of organization that was actually getting stuff done, quite effectively. Someone would formulate a task, go to bed, and wake up to a task completed by some random stranger from Australia. How much more powerful can this become?

Day 5: the group had 200 members

We bought the domain and quickly mocked up a website to present ourselves as a group, not just a random collective of a couple hundred geeks. Who knew every possible domain including “corona” was already taken by domain hoarders…

Day 6: the group had 250 members

We prepared some external communication pieces, because people had started wondering what exactly we are and what the point is. As you can imagine, that is not the easiest task when most of the team are highly intelligent but super-introverted individuals. But we made it happen; check it out.

Day 7: we are now 350 members strong.

Some structure is emerging: we are tackling the 4 tasks we identified as the most feasible to help with, and we now have individual teams working on them. We have a communications team that is integrating medical experts, we are working with both independent researchers and organizations, and we are now the largest independent organization with the strongest pool of highly technical, highly motivated talent tackling COVID-19 data analysis.

Wait, what’s going on? Is this a cult? Don’t you have a job?

It definitely feels like a cult; how else do you explain people spending so much time together, solving things they have no clue about, and trying to think beyond their typical 9–5 responsibilities? Ignoring their families, ignoring their kids, not sleeping, grinding through unstructured, highly complex, and often depressing mortality statistics…

But here’s the thing: we are believers, we are the crazy ones, and we get things done, much faster than any other company or team. People from NASA, Intel, and Amazon are joining us, and sometimes I wonder whether this is all some weird dream of mine. But it’s real, and it seems the most intelligent and proactive people are self-organizing around us.

People are even asking us to give them materials to request dedicated volunteering time from their organization. Honestly, we have no clue how to do that yet, but we will.

And of course, nothing is perfect. There is a lot of frustration: with the lack of structure and process, with the inability of other companies and organizations to move as fast as we need, and simply because no one has ever built such a fast-moving structure, fully distributed across the globe and across hundreds of disciplines, professional backgrounds, and cultures, that solves extremely complex problems on a short timeframe. Priorities, organizational issues, timezone management, communication protocols… It has taken us a week just to ask for computational resources from Amazon, Google, and the other giants; they are trying their best but are unable to move as fast (please help us here if you can).

– Is it just for AI geeks?

Not really. We’ve got people of diverse backgrounds and skillsets. And we need more, way MORE.

– Is it just for geeks in general then?

Not really. We’ve got professors of nuclear physics (pretty cool huh).

We’ve got psycholinguists?!

We’ve got impact investors…

We’ve got all kinds of PhDs, you name it!

– But do you have any medical experts or are you just coding?

– So are you saying anyone can help?

That’s exactly how it is. And even more importantly, we are currently severely understaffed with non-technical talent. We are literally 3–6 people managing 300+ full time; as you can imagine, we barely sleep or rest, but we are feeding off the energy, and the motivation keeps coming.

– Soo… you didn’t answer the main question, don’t you have a job?

Yes, we all have jobs. For now. The right question to ask is if YOU will have a job soon. Because the current economic crisis suggests we won’t.

No matter how skilled you are or which field you work in: it’s no longer just restaurant workers, Uber drivers, or cruise ship employees. It’s all of us; everyone is affected.

And if you haven’t been affected by the wave of layoffs, you certainly have a friend who has been. It sucks.

anonymous data from Blind app

It all started 2 weeks ago, when entrepreneurs and business owners entered the cash-preservation stage in an attempt to save their businesses. And it all spiraled: the wheels of the economic machine started to spin in a negative direction. Unfortunately, many were not prepared at all.

anonymous data from various entrepreneur communities supporting each other in tough times

And this will keep spiraling until all of us realize: there is no magical trillion-dollar package that will save us, you can’t flatten that curve, and this money has to come from somewhere…

That’s why we are doing it: because we realize the consequences of passively watching the world fall apart. If anything, it’s a perfect time to rebuild it and make things right.

A new world, new opportunities & new models

Even though it’s going to be tough for a while, we are seeing something amazing emerge out of this crisis. Self-organization has been taken to a level we’ve never seen before. Some are even comparing it to wartime, when researchers like Alan Turing worked 24/7 to crack codes for intelligence purposes.

It took us less than a week to go from 10 to 300 people, and similar things are happening everywhere across the world. If someone had asked me a few months ago how much it would cost to employ such talent, I would have said around $100m, with next to no chance of gathering such a team even after months of active recruiting. That’s a pretty big number, but does it match the potential impact? It doesn’t, if you compare it to the $2 trillion of damage the US government alone is about to take on…

2,000,000,000,000 dollars > 100,000,000 dollars

What’s next?

Now that you know what we are doing, you may be wondering: how can you help if we are so smart and have so many people already? And what’s the point of this article at all?

The truth is — we do need YOUR help.

Here are the tasks we are currently working on:

– Help us understand how geography affects virality.

– What do we know about COVID-19 risk factors?

– What is known about transmission, incubation, and environmental stability?

– What do we know about vaccines and therapeutics?

Here are the things we need help with:

CoronaWhy call to action document here

Here’s a snippet of what you will experience:

Also, what’s in it for you?

– Get practical experience in a fully global distributed remote team (quite an important skill that is very relevant in the coming era)

– Learn how AI/ML is helping in solving real world scenarios, not detecting cats and dogs

– Solve a billion dollar problem, or a $2 trillion one

– Get volunteer experience on your resume: a once-in-a-lifetime experience of saving humanity from a deadly virus

Save lives! Save your family, friends and loved ones!

And if you feel like this article deserves to be read — please help us and share it with your friends on other platforms. That’s a rare moment when AI guys are asking you to help with the same algorithms they are usually creating (what an irony), but that’s a story for another article!

Let’s go!


PS: Please do share this article, even if there is a 0.00001% chance anyone will see it. The impact of these 10 seconds of your life will be paid back in all the sleepless nights the research community can save.

PPS: We got posted on ProductHunt, please jump in to support us.


How Demand Side Platforms (DSP) Can Help Recalibrate Ad Businesses in 2020

A white-label DSP platform gives advertisers a chance to open independent ad tech businesses, set their own conditions, and influence the outcomes of media buying directly, without third-party providers. Let’s take a look at when advertisers go autonomous and what white-label gives them.

While more than 42% of budding startups die within their first year because they find no market demand for their product, in the programmatic business everything is slightly different. Programmatic advertising has a reputation as a very profitable business: in 2019, the growing share of algorithmic (programmatic) ad spend outweighed all other kinds of media buying (84% of total digital display ad spend).
According to eMarketer, programmatic ad spend will reach $57 billion in 2020 and is expected to grow to an unprecedented $79.75 billion by the end of 2021. This means only one thing: programmatic continues to redraw the digital advertising ecosystem.
Many years ago, Netflix and Kellogg built their own programmatic platforms as a way to save a good share of their media costs. Last year, Buyer reported a complete shift to in-house programmatic that was supposed to make its media purchases crystal clear and transparent.

If you are a novice in programmatic advertising, chances are you think in-house programmatic is for big businesses like these. However, brands that have tried white-label DSP prove the opposite — this new approach helps to build your own programmatic platform in-house without substantial financial injections and risks.

So, if you’re interested in how to enter Ad Tech business, let’s make a profound analysis of some core things:

●      What is a DSP?

●      What is white-label technology?

●      How is it different from a white-label DSP?

●      How do you find out whether you need to step onto the independent grounds of programmatic?

What is a DSP and why advertisers use it: 3 most important functions

Demand-side platforms automate the process of online media buying; this is their most important function. Technically speaking, a DSP is a software platform that connects to SSPs, ad networks, and ad exchanges and enables advertisers to purchase inventory programmatically, usually via RTB (real-time bidding) auctions.

A DSP doesn’t shoot blindly at whatever bid request comes its way; it needs a definite set of rules to act upon, and advertisers provide the platform with those directions in the user interface. Targeting (OS, device, age, gender, zip, channel, and more), frequency, budget limitations, filters, etc.: all these criteria take part in the decision-making process.

These rules enable delivering ad messages to the right audience at the right time and on the right device. They also limit the number of impressions per user to prevent ad fatigue and budget overspending. A toy sketch of such rule-checking follows.
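As an illustration of how such advertiser-defined rules can gate a bid decision, here is a hypothetical Python sketch; the field names, thresholds, and rule set are all invented and are not any real DSP's logic:

from dataclasses import dataclass
from typing import Optional

@dataclass
class BidRequest:
    os: str
    age: int
    zip_code: str
    impressions_today: int  # impressions already shown to this user today

# Advertiser-defined campaign rules, as configured in a DSP's UI (invented values)
TARGET_OS = {"ios", "android"}
TARGET_ZIPS = {"10001", "10002"}
AGE_RANGE = (18, 35)
FREQUENCY_CAP = 3   # max impressions per user per day
MAX_BID_USD = 2.50

def decide_bid(req: BidRequest) -> Optional[float]:
    # Return a bid price, or None to sit out this auction
    if req.os not in TARGET_OS:
        return None
    if not (AGE_RANGE[0] <= req.age <= AGE_RANGE[1]):
        return None
    if req.zip_code not in TARGET_ZIPS:
        return None
    if req.impressions_today >= FREQUENCY_CAP:
        return None  # frequency capping prevents ad fatigue and overspend
    return MAX_BID_USD

print(decide_bid(BidRequest("ios", 24, "10001", 1)))  # 2.5: joins the auction
print(decide_bid(BidRequest("ios", 24, "10001", 3)))  # None: frequency-capped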

Strengths of DSP solution that can’t be found in manual media buying

●      Automation of buying activities. As mentioned above, the platform uses the rules defined by the advertiser to automate the purchasing process. The technologies responsible: AI and ML.

●      Cost-efficiency and speed. With automatic auctioning, bid approval, and ad serving, there’s no need to negotiate placements, make a credit card transaction for every placement, or arrange email correspondence. As soon as a user with relevant cookies opens the page (or app), the DSP estimates the potential worth of the impression and joins the auction.

●      Real-time optimization. A negotiated placement leaves advertisers little room to change anything about their ad, even if it doesn’t bring a return on investment — the ad is paid for and placed somewhere for a definite period. A DSP platform provides real-time analytics. If something isn’t working, there’s always a chance to change the campaign flow in real time.

●      Only one person to operate it. With a DSP, the advertiser gathers all campaigns in one place – on one DSP dashboard. Unlike manual placement, the advertiser doesn’t have to bring in sales reps or agencies. Everything can be done by one person (self-serve DSP) or by a team of DSP specialists who assist with managed-serve campaigns (sometimes via ATDs).

What is white-label technology?

The white label concept is a type of partnership where one company produces goods, services or technology, and another purchases them and resells them under its own brand.

This way, each partner can focus on their core competencies (production or sales) and thereby save the substantial costs essential for a new business launch.

What is a white-label DSP platform?

A white label DSP platform enables taking this very same DSP technology in-house, which means enterprises can purchase prebuilt DSP software, customize it and label it as their own solution.

With white label DSP, the purchaser of the platform can operate as:

  • A white label DSP reseller: sell DSP services for brands on a commission basis
  • A media-buyer: optimize their own media buying costs
  • An agency: streamline work for clients or provide them with self-serve accounts

Where does white-label outrun self-development?

Like any programmatic solution, the demand-side platform is a complex piece of technology that requires building the software system and plugging it into the vast programmatic infrastructure. RTB, ad server connections and partner integrations make up only a fraction of the full list of must-dos.

When building a DSP platform from scratch, business owners need to attract a full spectrum of professionals across many areas of expertise: ad tech professionals, software developers, project managers and QA testers. Then, they have to wait until the product roadmap slowly turns into a robust, working and market-licensed platform. This can take months, half a year or more (depending on your team).

A white-label DSP solution is a pre-built, SaaS-based technology that is already fully hosted and maintained, comes with RTB integrations built in, and requires only minimal customization.

This alternative business-building approach removes all the burdens of building the platform from scratch: developing, designing, testing, licensing and hiring people to scale it up when the business expands.

Costs. ‘Building your own DSP solution’ is a phrase that already sounds costly. However, while building a platform from scratch may cost somewhere between $40,000 and $250,000 (again, heavily depending on your team), the white-label model will make it happen for $1,000 to $50,000 (pre-made technologies are known for their affordability).

Why do companies look for white label DSP platforms?

White label DSP platforms are gaining momentum because more companies are striving to take their programmatic purchases in-house. By the end of 2019, 38% of advertisers, 66% of agencies and 65% of publishers in Europe had taken their programmatic purchasing in-house.

Why are they doing this? Even though programmatic allows agencies to simplify and automate media-purchasing as much as possible, it still suffers from challenges associated with insufficient transparency, also known as the ‘black box’ problem.

Some DSPs don’t disclose their terms of cooperation, so agencies can’t give customers relevant information about the service commissions and margins the DSP charged them.

As a result, brands see only the total cost of services and are unable to distinguish what they are paying for. By selecting an in-house strategy, 50% of advertisers expect to increase operational control, reduce media costs (42%) and gain greater transparency on where campaigns run (33%).

Do you need to set up a white-label DSP?

A white label DSP solution enables a simple shift to in-house programmatic media-buying. While a self-serve DSP may perfectly satisfy the needs of a small brand, in some cases business needs may outgrow the basic abilities of a DSP. The following examples illustrate this point:

Saving money. Exactly for this reason, Bayer brought programmatic in-house and saved approximately $10M during the first 6 weeks. Sure, if your regular media spend is small, there’s no point in worrying about DSP service commissions — normally they won’t exceed 30% per bid.

For instance, if you spend $1,000 per month, the commission will barely reach $300. However, if your regular media spend climbs up to, say, ~$29,000, you will leave $8,700 on the table.

What is a white-label DSP in terms of payment? It is a subscription-based platform that only charges a fee on the generated ad spend. So, if the ad spend is ~$29,000, with a typical 5% monthly fee, you give out $1,450 instead of $8,700. The quick calculation below makes the comparison explicit.
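A back-of-the-envelope check of those numbers (the 30% commission and 5% fee are the illustrative rates from the text, not quotes from any provider):

# Back-of-the-envelope comparison of a DSP commission vs. a white-label fee.
# Both rates are the illustrative figures used in this article.
monthly_ad_spend = 29_000
dsp_commission_rate = 0.30   # upper-bound DSP service commission
white_label_fee_rate = 0.05  # typical white-label subscription fee

dsp_cost = monthly_ad_spend * dsp_commission_rate           # 8700.0
white_label_cost = monthly_ad_spend * white_label_fee_rate  # 1450.0
print(f"DSP commission:  ${dsp_cost:,.0f}")                     # $8,700
print(f"White-label fee: ${white_label_cost:,.0f}")             # $1,450
print(f"Monthly savings: ${dsp_cost - white_label_cost:,.0f}")  # $7,250

The gap widens linearly with ad spend, which is why the in-house question becomes pressing only above a certain monthly budget.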

Scaling and avoiding risks. Building your platform with a white label DSP solution enables you to set up a functioning platform in less than a month. Most importantly, it helps to overcome one fundamental business-building risk: since the platform is already time-proven and finished, there is a guarantee that the technology will be bug-free and robust enough to measure up to big and reputable programmatic platforms.

For some companies, a white-label solution is a chance to join the RTB ecosystem effortlessly; for others, it is also a chance to scale their platform anytime. The solution is hosted on remote servers, so updates, upgrades and improvements happen in the background — there’s no need to halt operations in order to scale capacity.

Ensuring safety and consistency. Customer data protection and advertising fraud are the weak spots of the advertising ecosystem. Brands that accumulate and use first-party data for targeting sometimes fear that it will flow through a third-party supplier to competitors.
The world has seen enough examples of data misuse — the Cambridge Analytica scandal is only the loudest of them. For this reason, advertisers strive to independently control user data and traffic quality (applying scanners and monitoring statistics in real time).

In order to extend their potential reach, advertisers are often pushed to work with several DSPs, which creates another roadblock to effective targeting — bidding against your own bids in different auctions. A white label DSP, in this situation, is the best shield against competing with yourself, since you track every activity of the system.

Additional strengths a white label DSP solution may have:

  • Ability to connect existing SSP partners via API and obtain their user data
  • Ability to order on-demand features, integration and customization
  • Ability to support the widest range of ad formats
  • Ability to modify the system and integrate external modules via API

Things you should know before building a white-label DSP platform

White label DSP platforms solve a lot of problems that non-proprietary ad tech solutions can’t. Still, can we expect the advertising industry to completely transition to in-house programmatic in the years to come? On the one hand, big players are massively heading in-house; on the other hand, the number of brands that open their own programmatic platforms and departments is relatively small, and there is nothing wrong with that. Before deciding, weigh the following:

1) Your budget. The notion that opening your own programmatic platform requires an enormous budget is nothing more than a common misconception. As illustrated by the media-budget example above, if a company doesn’t spend a lot of money on media, or buys inventory rarely or seasonally, there is no need to buy an advertising platform and run an independent advertising business. Why would you buy a ship if you only sail once a year?

Plus, there will always be players who follow the model of core competencies and who will not try to embrace innovative technologies or business models just for the sake of it. Before you adopt any technology, you should always ask yourself what problem it is supposed to solve; only then will it bring you profits and the desired return on investment.

2) Where to get expertise. Not every advertising agency or enterprise has sufficient experience in the programmatic business. Sure, white label DSP providers will build the platform from the ground up for you, but the promotion, market positioning and operation as a whole will be entirely on you. Your platform is your business — no one dictates to you or patronizes you about which payment model to use, which partners to befriend or where to put the label on the dashboard.

In order to effectively cope with operational tasks and optimize ad campaigns on behalf of clients (the success of which depends on many conditions), it is necessary to have programmatic professionals on staff, sometimes entire departments. For this reason, even with the growing popularity of in-house programmatic, many companies will still rely on agency experience. Others, meanwhile, select a strategy in which the DSP operation process is fully or partially outsourced.

The takeaway

A DSP is a complex technological platform that streamlines and effectively automates media-buying. However, constructing and launching one can be very expensive if you build it from scratch.

In addition to the costs of development, testing and platform customization, building your own infrastructure can take time that could otherwise be spent conquering a new programmatic niche.

The white label concept is an excellent choice if you have the idea of starting your own ad tech business in mind and plan to create branded DSP technology of your own.

For many enterprises and agencies, building a DSP from scratch is impossible from a financial standpoint. However, if you spend a lot of your budget on media-buying, then the decision to get your own programmatic solution may look pretty appealing.

A white label DSP solves the problems of excessive fees and insufficient transparency and accountability in media-buying. It cannot be ruled out that in ten years, technologies like white label will make programmatic businesses commonplace for every company.

However, in making such an important decision, every entrepreneur must also answer the questions: “What gaps am I covering?”, “Who will be in charge of operations?” and, finally, “Am I ready to run my own advertising business?”

(Disclaimer: The author is the CMO at SmartyAds)


Related posts

Building a Twitter Bot to Automate Posting Cryptocurrency News with Python [A Step-by-Step Guide]

Photo by Yucel Moran on Unsplash

Imagine your own custom Twitter bot posting real-time market updates for your favorite cryptocurrencies. It is possible with Python, Tweepy, and LunarCRUSH. Below we will outline a step-by-step process using Python. The bot pulls information from LunarCRUSH; the metrics for Bitcoin, Ethereum, and Litecoin are stored as formatted variables and tweeted out automatically every 30 minutes.

Setting up a Twitter Development Account and Tweepy

Let’s start off by installing Tweepy. It’s simple enough from the terminal using the pip command:
pip install tweepy
Twitter for Developers provides access to the Twitter API so you can publish and analyze tweets, optimize ads, and create unique customer experiences. Check out the Twitter API documentation here. You can perform multiple tasks through this API; some of them are listed below, followed by a rough sketch of what they look like in code:

  • Post and retrieve tweets
  • Follow and unfollow users
  • Post direct messages
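As a quick, hedged sketch of those tasks in code (the handle and user id below are placeholders, exact method signatures vary between Tweepy versions, and the authenticated api object is set up later in this guide):

# Rough sketch of the tasks above; assumes `api` is an authenticated
# tweepy.API instance (created later in this guide).
api.update_status("Hello from my bot!")        # post a tweet
for tweet in api.home_timeline(count=5):       # retrieve recent tweets
    print(tweet.text)
api.create_friendship("some_handle")           # follow a user (placeholder handle)
api.destroy_friendship("some_handle")          # unfollow a user
api.send_direct_message("12345", "Hi there")   # DM by recipient user id (Tweepy 3.x signature)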

Before you are able to use the Twitter API endpoints, create a developer account and generate your API keys. You can apply for a developer account directly here. You must answer questions on how you plan to use the API and accept the Twitter Developer Agreement, and then you will be granted access to the Developer Dashboard.
Once you are approved for access to Twitter for Developers, log in to the developer site and create your app. This step will automatically generate your consumer API keys and access tokens; remember, you should keep them secret:

The developer account should be linked to the Twitter account where you want to have the bot active. From the Twitter Development platform, you are able to edit the app permissions. In my example, I have granted my app permission to read, write and send direct messages.

Introduction to LunarCRUSH — Social Listening For Crypto

In order to start pulling in cryptocurrency social insights, you must head over to LunarCRUSH.com and set up an account.

In the left-hand navigation, head to Developers and select API Docs.

Generate V2 API Key

Once you are in the API Docs, generate your V2 API Key.

Building a Twitter bot with Python, Tweepy, LunarCRUSH

Let’s start building our Twitter bot. As mentioned previously, you will use the Tweepy library, which works seamlessly with the Twitter API, together with the LunarCRUSH API / LunarSTREAM™.

First, import tweepy. Tweepy makes it easier to authenticate to the API through our Twitter app’s secret keys.

Below is an extract of the code. Do this by creating an OAuthHandler instance to handle our login, passing the consumer key and consumer secret as arguments.

In order to make requests to the API, Twitter sends back an access token. Use the auth.set_access_token method to store that access token for our session.

Now you are ready to control your Twitter account with Python. Note that I have included ‘XXX’ instead of my real secret keys. Replace ‘XXX’ with the secret keys you obtained in your Twitter Developers dashboard.
import tweepy
import urllib.request
import ssl
import json
import time
ssl._create_default_https_context = ssl._create_unverified_context
# Oauth keys
consumer_key = "XXX"
consumer_secret = "XXX"
access_token = "XXX"
access_token_secret = "XXX"
# Authentication with Twitter
auth = tweepy.OAuthHandler(consumer_key, consumer_secret)
auth.set_access_token(access_token, access_token_secret)
api = tweepy.API(auth)

The api variable is where we store the auth settings. You will use it to make requests to the Twitter API.

The idea of this Twitter bot is to publish a different tweet every X minutes with specific cryptocurrency coin/token metrics. This can easily be done using the LunarCRUSH API and LunarSTREAM™.

Let’s add our LunarCRUSH API Keys to the code. Simply add:

api_key = "XXX"

Now you are authenticated with the LunarCRUSH API. It is time to decide which cryptocurrencies you would like to integrate into your tweets. Use coin_list to store the different crypto symbols. For instance, ‘LTC’ is Litecoin, ‘ETH’ is Ethereum, and ‘BTC’ is Bitcoin.

# Allows adding as many coins as desired
coin_list = ["LTC", "ETH", "BTC"]
coins = ','.join(coin_list)

A list of the fields desired from the API — the key is the LunarCRUSH key, and the value is the field name output to Twitter:

{"LUNAR_CRUSH_KEY": "RENDERED_NAME"}

For example, to add tweet_replies:

{"tweet_replies": "Tweet Replies: "},

you would add this to the list below.

List comprehensions provide a concise way to create lists. Common applications are to make new lists where each element is the result of some operations applied to each member of another sequence or iterable, or to create a subsequence of those elements that satisfy a certain condition. — Python Data Structures Docs
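As a toy illustration of that idea applied to this bot, a list comprehension can pull every LunarCRUSH key out of the field map in one line (this shortened field_map is a hypothetical stand-in for the full map defined below):

# Toy example: extract each LunarCRUSH key from a shortened field map
# with a list comprehension. The bot's real map is defined just below.
field_map = [{"name": ""}, {"price": " Price: "}, {"galaxy_score": " Galaxy Score: "}]
keys = [list(field.keys())[0] for field in field_map]
print(keys)  # ['name', 'price', 'galaxy_score']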

Now you can map which values you would like to pull from LunarCRUSH API. As the program becomes more complex, this should be written in a more robust manner.

map = [
    {"name": ""}, {"symbol": ""},
    {"price": " Price: "}, {"percent_change_24h": " - 24 Hour Percent Change: "},
    {"market_cap": " Market Cap: "}, {"volume_24h": " 24 Hour Volume: "},
    {"url_shares": " URL Shares: "}, {"reddit_posts": " Reddit Posts: "},
    {"tweets": " Tweets: "}, {"galaxy_score": " Galaxy Score: "},
    {"volatility": " Volatility: "}, {"social_volume": " Social Volume: "},
    {"news": " News: "}, {"close": " Close: "},
]
def final_render(asset_tweet, value, key, asset):
    if key == 'symbol':
        asset_tweet += " (" + asset[key] + ")"
    elif key == 'percent_change_24h':
        asset_tweet += value + str(asset[key]) + "%"
    else:
        asset_tweet += value + str(asset[key])
    return asset_tweet

Now, iterate over each of the fields from LunarCRUSH; for each field, the code gets the value from LunarCRUSH and renders it with the field name.

def main():
    url = "https://api.lunarcrush.com/v2?data=assets&key=" + api_key + "&symbol=" + coins
    assets = json.loads(urllib.request.urlopen(url).read())
    for asset in assets['data']:
        asset_tweet = ""
        for field in map:
            key = list(field.keys())[0]
            value = list(field.values())[0]
            asset_tweet = final_render(asset_tweet, value, key, asset)
        print(asset_tweet)
        print(len(asset_tweet))
        # Posts tweets
        api.update_status(status=asset_tweet)

# Runs main() every 30 minutes
while True:
    main()
    time.sleep(1800)

Complete Python Code

import urllib.request
import ssl
import json
import time
import tweepy

ssl._create_default_https_context = ssl._create_unverified_context

# Oauth keys
consumer_key = "XXX"
consumer_secret = "XXX"
access_token = "XXX"
access_token_secret = "XXX"

# Authentication with Twitter
auth = tweepy.OAuthHandler(consumer_key, consumer_secret)
auth.set_access_token(access_token, access_token_secret)
api = tweepy.API(auth)

# LunarCRUSH API key
api_key = "XXX"

# Allows adding as many coins as desired
coin_list = ["LTC", "ETH", "BTC"]
coins = ','.join(coin_list)

# A list of the fields desired from the API - key is the LunarCRUSH key,
# and the value is the field name outputted to Twitter:
# {"LUNAR_CRUSH_KEY": "RENDERED_NAME"}
# For example, to add tweet_replies, you would add
# {"tweet_replies": "Tweet Replies: "},
# to the list below.
map = [
    {"name": ""}, {"symbol": ""},
    {"price": " Price: "}, {"percent_change_24h": " - 24 Hour Percent Change: "},
    {"market_cap": " Market Cap: "}, {"volume_24h": " 24 Hour Volume: "},
    {"url_shares": " URL Shares: "}, {"reddit_posts": " Reddit Posts: "},
    {"tweets": " Tweets: "}, {"galaxy_score": " Galaxy Score: "},
    {"volatility": " Volatility: "}, {"social_volume": " Social Volume: "},
    {"news": " News: "}, {"close": " Close: "},
]

def final_render(asset_tweet, value, key, asset):
    # As the program becomes more complex, this should be written in a more robust manner
    if key == 'symbol':
        asset_tweet += " (" + asset[key] + ")"
    elif key == 'percent_change_24h':
        asset_tweet += value + str(asset[key]) + "%"
    else:
        asset_tweet += value + str(asset[key])
    return asset_tweet

# Iterates over each field from LunarCRUSH, gets the value and renders it with the field name
def main():
    url = "https://api.lunarcrush.com/v2?data=assets&key=" + api_key + "&symbol=" + coins
    assets = json.loads(urllib.request.urlopen(url).read())
    for asset in assets['data']:
        asset_tweet = ""
        for field in map:
            key = list(field.keys())[0]
            value = list(field.values())[0]
            asset_tweet = final_render(asset_tweet, value, key, asset)
        print(asset_tweet)
        print(len(asset_tweet))
        # Posts tweets
        api.update_status(status=asset_tweet)

# Runs main() every 30 minutes
while True:
    main()
    time.sleep(1800)

Example Crypto Twitter Bot Tweet

Additional Functionalities to Include in a LunarCRUSH + Python Twitter Bot

Not only can the bot post tweets; a LunarCRUSH + Python + Tweepy Twitter bot can perform additional functions.

For example:

Pull information about a particular Twitter user — User methods
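A hedged sketch of what that looks like (the handle is a placeholder; attribute names follow Tweepy's v1 User model, and api is the authenticated instance from earlier):

# Pull information about a particular Twitter user via Tweepy's user methods.
user = api.get_user("some_handle")      # placeholder handle
print(user.screen_name)                 # the user's @name
print(user.followers_count)             # number of followers
print(user.description)                 # profile bio
for follower in user.followers()[:5]:   # a few of the user's followers
    print(follower.screen_name)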

Configuration Options

It is possible to configure the bot to extract additional features. For example:

?key={API_KEY_HERE} - Required to render the widgets.
?symbol=BTC - Change the symbol that is displayed in the widgets.
?interval=1 Week - Change the time interval being displayed in the charts (default is 1 Week).
?price_correlation=true|false - Show a price line in addition to the selected metric (default = false)
?metric=galaxy_score - Change the timeseries metric being displayed (Metric widget only).
?animation=true|false - Show or hide component animations (default = true)
?theme={See themes section for instructions}
?scrolling=true|false (default = true) - Enable or disable scrolling on the widget inner content. Use this if you want to set scrolling=false on the iframe with a fixed height but still want to allow scrolling within the widget.

You have the ability to configure and add all available metrics from LunarCRUSH:

market_cap (Market Cap)
galaxy_score (Galaxy Score)
price_score (Price Score)
average_sentiment (Average Sentiment)
social_impact_score (Social Impact Score)
correlation_rank (Correlation Rank)
volatility (Volatility)
social_score (Social Volume)
social_volume (Social Volume)
twitter_volume (Twitter Volume)
reddit_volume (Reddit Volume)
news_volume (News Volume)
search_volume (Search Volume)
spam_volume (Spam Volume)
bullish_sentiment (Bullish Sentiment)
bearish_sentiment (Bearish Sentiment)

Metrics Widgets
average_sentiment (Average Sentiment)
correlation_rank (Correlation Rank)
galaxy_score (Galaxy Score)
market_cap (Market Cap)
market_cap_rank (Market Cap Rank)
news_articles (News Volume)
popular_tweet (Popular Tweets)
price_btc (Price BTC)
price_score (Price Score)
priceclose (Price Close)
pricehigh (Price High)
pricelow (Price Low)
priceopen (Price Open)
reddit_comment (Reddit Comments)
reddit_post (Reddit Posts)
reddit_post_reddit_comment (Reddit Volume)
search_average (Search Volume)
social_impact_score (Social Impact Score)
social_score (Social Volume)
tweet (Twitter Volume)
tweet_sentiment1 (Very Bearish Sentiment)
tweet_sentiment2 (Bearish Sentiment)
tweet_sentiment2_tweet_sentiment (Negative Sentiment)
tweet_sentiment3 (Neutral Sentiment)
tweet_sentiment4 (Bullish Sentiment)
tweet_sentiment5 (Very Bullish Sentiment)
tweet_sentiment4_sentiment5 (Positive Sentiment)
tweet_sentiment_impact1 (Very Bearish Sentiment Impact)
tweet_sentiment_impact2 (Bearish Sentiment Impact)
tweet_sentiment_impact3 (Neutral Sentiment Impact)
tweet_sentiment_impact4 (Bullish Sentiment Impact)
tweet_sentiment_impact5 (Very Bullish Sentiment Impact)
tweet_spam (Spam Volume)
volatility (Volatility)
volumefrom (Market Volume Open)
volumeto (Market Volume Close)

Final Thoughts

Within a few lines of code, your easily configurable Twitter bot now pulls data from LunarCRUSH and automatically engages your audience with real-time, reliable social insights.

There are a few things that can be done to improve the code, such as adding more LunarCRUSH parameters to tweet richer cryptocurrency data, or making tweets more visually appealing with images and hashtags, as sketched below.
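For instance, here is a minimal sketch of attaching an image and hashtags, assuming the authenticated api object from above and a hypothetical local file chart.png:

# Sketch: tweet an image with hashtags (Tweepy v1 media upload).
# 'chart.png' is a hypothetical local file; replace it with a real path.
media = api.media_upload("chart.png")
status = "$BTC update #Bitcoin #crypto"
api.update_status(status=status, media_ids=[media.media_id])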

Please let me know in the comments if you have any questions or suggestions.
