Archyblog


Ramblings of yet another developer!
  • Empty or Zero Percent (0%) Code Coverage With PHPUnit and pcov

    Posted on Feb 14, 2024

    I recently updated my PHP version and, since I use Docker, I bumped the image tag to pull in the newer version. After doing this, my PHPUnit tests stopped reporting coverage. With PHPUnit 8.5 the report listed all of my files, but every one of them showed 0% coverage. I upgraded to 9.6 in an effort to fix things, and the report then showed no files at all, so it was crystal clear the issue lay elsewhere!
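
    The post digs into the fix, but for context: a common cause after swapping PHP Docker images is that the pcov extension simply isn't installed or enabled in the new image, or that pcov.directory no longer points at the source tree. A minimal sketch for an official php image (paths illustrative, and not necessarily the post's exact fix):

      # Install and enable pcov in the new image
      RUN pecl install pcov && docker-php-ext-enable pcov
      # pcov scans the working directory by default; point it at the source
      # tree so the report isn't empty when tests run from elsewhere
      RUN printf "pcov.enabled=1\npcov.directory=/app/src\n" \
            > /usr/local/etc/php/conf.d/zz-pcov.ini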

  • "Unable to read key" Laravel Error Deploying to Dokku/Heroku

    Posted on Jan 10, 2024

    I've recently had all sorts of issues building a Laravel app for Dokku because the Heroku buildpack it uses builds the application in an isolated container that has no access to the storage/ directory. I consistently found myself getting errors such as the following when running artisan commands: Unable to read key from file file:///tmp/build/storage/oauth-public.key
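
    Judging by the oauth-public.key filename, these are Laravel Passport's keys. One hedged workaround (my assumption, not necessarily the post's solution) is to supply the keys through environment variables so nothing needs to exist under storage/ at build time; recent Passport versions read these out of the box via the passport.private_key and passport.public_key config entries:

      # .env on the Dokku/Heroku side (key material elided)
      PASSPORT_PRIVATE_KEY="-----BEGIN RSA PRIVATE KEY----- ..."
      PASSPORT_PUBLIC_KEY="-----BEGIN PUBLIC KEY----- ..."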

  • "Trying to get property 'x' of non-object" With Null Coalescing Operator in PHP

    Posted on Aug 06, 2020

    Quick one here, but today, thanks to Tez on Stack Overflow, I discovered that PHP's null coalescing operator (the double question mark operator) can still fail when you use a function call in any part of the statement.
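
    A minimal reproduction of the gotcha (PHP 7.x, with a hypothetical getUser() for illustration): ?? silences the notice for plain variable, array, and property lookups, but a function call in the chain must actually be evaluated, and dereferencing its null return value still raises the notice.

      <?php
      function getUser() {
          return null; // e.g. no matching user
      }

      $name = $user->name ?? 'anonymous';     // silent, even though $user is unset
      $name = getUser()->name ?? 'anonymous'; // Notice: Trying to get property 'name' of non-object

      // A method call on a null object is worse still: a fatal error ?? can't catch.
      // $name = $user2->profile()->name ?? 'anonymous';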

  • Migrating to Partitioned Native Google BigQuery Table From External GCS Files

    Posted on May 22, 2020

    I've recently been working on an Apache Kafka/Confluent data pipeline to analyse event streams. I decided to use Google Cloud BigQuery for the data analysis, as it seemed easy to set up and extremely powerful. But to get up and running I'd need to backfill all my existing data, and I decided to load it into a time-partitioned table to improve performance and reduce costs.
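
    The details are in the post, but as a rough sketch of the idea (project, dataset, and column names here are made up): BigQuery can backfill a partitioned native table straight from an external table defined over the GCS files, using CREATE TABLE ... PARTITION BY ... AS SELECT.

      -- Illustrative names; events_external is an external table over the GCS files
      CREATE TABLE `my_project.analytics.events`
      PARTITION BY DATE(event_timestamp)
      AS
      SELECT *
      FROM `my_project.analytics.events_external`;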

  • Kafka/KSQL Streams Lost When Producing With Golang

    Posted on Mar 14, 2019

    Odd one this, and one that took me a little while to debug. I recently set up a Confluent/Kafka data pipeline with transformations handled by KSQL and data produced by an application written in Go. As part of the test process I persisted data using a MongoDB Sink connector. The command-line producers had no problems: producing a large file would persist the expected data to MongoDB. However, when producing from Golang I noticed that only somewhere between 7% and 12% of the messages were being persisted to MongoDB; the rest were lost somewhere in the pipeline.
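
    The excerpt doesn't give the culprit away, but two usual suspects with Go producers are worth checking: exiting before the client has flushed its buffered messages, and librdkafka's default partitioner (consistent_random) not matching the murmur2 partitioning that Java clients, and therefore KSQL's co-partitioning assumptions, rely on. A sketch assuming confluent-kafka-go (broker address and topic are placeholders):

      package main

      import (
          "fmt"

          "github.com/confluentinc/confluent-kafka-go/kafka"
      )

      func main() {
          p, err := kafka.NewProducer(&kafka.ConfigMap{
              "bootstrap.servers": "localhost:9092",
              // Align with the Java clients' murmur2 partitioning so keyed
              // messages land in the partitions KSQL expects.
              "partitioner": "murmur2_random",
          })
          if err != nil {
              panic(err)
          }
          defer p.Close()

          topic := "events"
          err = p.Produce(&kafka.Message{
              TopicPartition: kafka.TopicPartition{Topic: &topic, Partition: kafka.PartitionAny},
              Key:            []byte("some-key"),
              Value:          []byte(`{"hello":"world"}`),
          }, nil)
          if err != nil {
              panic(err)
          }

          // Produce() is asynchronous; anything still buffered when the
          // process exits is silently dropped, so flush before quitting.
          for left := p.Flush(15000); left > 0; left = p.Flush(15000) {
              fmt.Printf("%d message(s) still awaiting delivery\n", left)
          }
      }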