For most people, a spreadsheet full of numbers and letters is daunting and confusing to look at. For years, this raw data has had to be processed by specialized staff, who undertake time-consuming manual procedures to extract accessible insights for the wider business.
With platforms like Looker, you can automate insight generation and improve access to information. You no longer need a dedicated team of analysts to create and send reports. Instead, the Looker platform can automate extract, transform, and load (ETL) processes and deliver visualized data insights that are easily understood by all employees.
With Looker 7.0, the latest release, you have access to industry-leading tools and integrations for data analytics. Some experimental features are also arriving with the release of Looker 7.2 on February 20, 2020.
Here are some features to keep an eye on:
Looker supports a massive range of SQL database dialects, including:
Google BigQuery Legacy/Standard SQL
Microsoft Azure PostgreSQL
Microsoft Azure SQL Server 2016
This is just a small selection of the most popular database dialects you can use natively with Looker. A database dialect is the specific fork of Structured Query Language (SQL) used by that database system.
Looker offers two support levels for dialects: "supported" and "integration." All of the dialects above are fully supported, and Looker actively works to fix implementation issues as part of its platform support offering. You can also connect "integration"-level dialects to Looker, but these do not come with support.
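To make the idea of a dialect concrete, consider how two supported databases limit query results: BigQuery Standard SQL (like PostgreSQL) appends a LIMIT clause, while SQL Server 2016 uses TOP after SELECT. The sketch below is purely illustrative; the dialect names and helper function are hypothetical, not part of Looker's API, though Looker performs this kind of dialect-specific SQL generation internally through its modeling layer.

```python
# Illustrative sketch: rendering the same logical query for different
# SQL dialects. The dialect identifiers and this helper are hypothetical.

def limit_clause(dialect: str, base_query: str, n: int) -> str:
    """Render a row-limited query for a given SQL dialect."""
    if dialect in ("bigquery_standard", "postgres"):
        # BigQuery Standard SQL and PostgreSQL append a LIMIT clause
        return f"{base_query} LIMIT {n}"
    if dialect == "mssql_2016":
        # SQL Server 2016 places TOP immediately after SELECT
        return base_query.replace("SELECT", f"SELECT TOP {n}", 1)
    raise ValueError(f"unsupported dialect: {dialect}")

query = "SELECT name FROM users"
print(limit_clause("postgres", query, 10))    # SELECT name FROM users LIMIT 10
print(limit_clause("mssql_2016", query, 10))  # SELECT TOP 10 name FROM users
```

Because Looker handles this translation for you, analysts write their logic once in LookML and the platform emits SQL appropriate to whichever supported dialect the connection uses.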
Looker has been working to broaden the scope of their hosting platform support, and now supports SOC 2 Type 1 compliance as standard when deploying through the Google Cloud Platform.
A SOC 2 Type 1 certification attests to the security, availability, and confidentiality of your datasets in a GCP-hosted environment, alleviating many cybersecurity concerns for IT departments. Looker already maintains a more comprehensive SOC 2 Type 2 audit report on AWS; Type 2 covers a period of months rather than the single point in time assessed by Type 1.
Looker is continuing to expand its support, with plans for native Microsoft Azure hosting in early 2020.
Looker Actions offers a standardized form of communication between your existing collaborative work management platform, enterprise instant messaging applications, and various cloud platforms.
Collaborative Workflow and IM Integration – With Slack and Jira, you can deliver pre-compiled reports directly into your workflows to expedite cross-business communication. These can be exported directly from Looker, removing the need for manual conversion and sending.
Cross-Platform Data Delivery – Looker uses something called “cloud storage buckets” to deliver data between your various hosting platforms. Supported cloud platforms include Azure Blob Storage, Google Cloud Storage, Amazon S3, DigitalOcean Storage, and more. This makes data available across your heterogeneous network, while simultaneously decreasing bandwidth requirements by reducing the number of cross-platform network requests.
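The bandwidth savings come from decoupling report generation from consumption: a report is written once to a bucket, and every downstream consumer reads that stored object instead of issuing its own cross-platform request. A rough sketch of that pattern, assuming an in-memory stand-in for a real bucket (these classes are hypothetical, not Looker code):

```python
# Hypothetical sketch of bucket-style delivery: publish a report once,
# then let any number of consumers read it from storage.
from typing import Dict


class InMemoryBucket:
    """Stand-in for a cloud storage bucket (S3, GCS, Azure Blob, ...)."""

    def __init__(self) -> None:
        self._objects: Dict[str, bytes] = {}

    def put(self, key: str, data: bytes) -> None:
        self._objects[key] = data

    def get(self, key: str) -> bytes:
        return self._objects[key]


def deliver_report(bucket: InMemoryBucket, key: str, rows: list) -> None:
    """Serialize a report as CSV once and publish it to the bucket."""
    payload = "\n".join(",".join(map(str, row)) for row in rows)
    bucket.put(key, payload.encode("utf-8"))


bucket = InMemoryBucket()
deliver_report(bucket, "reports/sales.csv", [["region", "total"], ["emea", 42]])
# Each consumer reads the stored object; no repeated cross-platform requests:
print(bucket.get("reports/sales.csv").decode("utf-8"))
```

In practice the `InMemoryBucket` would be replaced by a real client (for example, the official S3, GCS, or Azure Blob SDK), but the one-writer, many-readers shape of the delivery stays the same.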
Looker provides managed data infrastructure services through its platform on AWS and GCP, but Azure is not yet available as an option.
For those who want to self-manage, Looker is compatible out of the box with any cloud hosting provider. You can consolidate your analytics on a single cloud platform or adopt a multi-cloud approach to take advantage of the best pricing on each platform.
Trianz is a leading business intelligence and analytics consulting firm with a comprehensive understanding of the complex world of data management. We have partnered with Looker to deliver expert assessment and implementation services for their platform.
Get in touch with our BI consulting team to learn how you can remove the barriers to analytics accessibility with Trianz.
Contact Us Today