1. Writing For SEO

    [Blog] Writing For SEO: Five Steps To The Right Keywords For Your E-commerce Website

    Finding the right keywords for your e-commerce website is absolutely fundamental to the success of your business online.

    There are lots of ways of going about Key Phrase Research (KPR). Here’s one I wrote about earlier.

    You can do some of the research using free tools, but the quality and relevance of the results will be compromised. As in most areas of life, the most valuable types of information in KPR come at a cost.

    I’ll show you what kind of information is available for free, some sources for that information, and the pitfalls of relying on free data.

    What makes a great key phrase?

    A great key phrase has just three characteristics:

    • Relevant to your site – you want visitors who are likely to buy your products or services
    • High search numbers – you want as many of the right visitors as possible

    • Reasonable levels of competition – you’ll want an ‘above the fold’ position on searches if people are going to click through to your sites in good enough numbers

    If you already have some analytics data from your site, you can also add key phrases related to ones that have already worked for you.

    Let’s find some great key phrases.

    1. Brainstorm some key phrases

    Not any old key phrases, though. Think about how you’d expect people to search to find your site. The idea is to find some key phrases for each part of your site, but don’t try to be exhaustive; the tools we’ll use will look after that.

    These will be your seed key phrases. Here are some basic rules I suggest to my clients if they are unsure how to approach this task:

    • Group your keywords according to product type – that is, have a list for blue widgets, a list for green widgets, another for striped widgets, and so on
    • Aim for between five and 10 seed key phrases in each list

    • Don’t be too general. Make sure your seeds are at least two words long

    • Don’t be too specific, either – ‘stage 2 green widget with 4.25mm reverse thread’ may be a big seller for you, but it’s way too far down the long tail to help you as a seed

    • If you already have data for how your site is performing, you should include key phrases that are already converting – leading to sales – in this list. Some more of the same would be excellent!
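    Those rules are easy to sanity-check mechanically if you keep your seed lists in code. Here’s a minimal Python sketch – the product types and phrases are invented examples, not recommendations:

```python
# Group seed key phrases by product type and flag any that break the
# basic rules above: at least two words per seed, 5-10 seeds per list.
# All product names and phrases here are invented examples.

def check_seeds(seed_lists):
    """Return a dict mapping product type -> list of problems found."""
    warnings = {}
    for product, seeds in seed_lists.items():
        problems = [s for s in seeds if len(s.split()) < 2]
        if not 5 <= len(seeds) <= 10:
            problems.append("expected 5-10 seeds, got %d" % len(seeds))
        if problems:
            warnings[product] = problems
    return warnings

seed_lists = {
    "blue widgets": ["blue widgets", "buy blue widgets", "cheap blue widgets",
                     "blue widget shop", "blue widgets uk"],
    "green widgets": ["widgets", "green widgets", "green widget store",
                      "buy green widgets", "green widgets online"],
}

print(check_seeds(seed_lists))  # flags "widgets" as too general
```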

    2. Find some alternative key phrases

    You now have a list of key phrases that you think people will use to find your site. Now you can find out what people are actually searching on.

    There are all sorts of free tools (you may have to open an account to use them) out there on the Internet to help you do this, such as:

    • Google Keyword Planner – you’ll need to have a Google AdWords account to use this, but it’s many people’s first port of call for expanding their list of key phrases
    • Wordstream Keyword Tool – data from its own database.
    • Wordtracker – has been around for years, and some people swear by it
    • SEO Book Keyword Tool – global data only (no good if you want, say, UK search results), but from a range of sources
    • Ubersuggest – one of my favourites for basic key phrase suggestions

    All of these tools give you a measure of how many times these key phrases are being searched on – their popularity. If they’re a good fit for what you’re selling, they could bring you lots of potential buyers – if you can rank well for them.

    3. Will you be able to rank for the key phrase?

    Just because a key phrase has a lot of searches, it doesn’t mean it’s the right one for you, even if it is very relevant to what you want to sell.

    You won’t get a lot of traffic from such a key phrase if you only appear on the fifth page, say. You need to pick out key phrases with lower levels of competition, as you should be able to rank higher for them in a shorter time.

    Here’s how you can assess the competition for the key phrases you’ve found.

    4. Follow the free but potentially inaccurate route

    If you have access to the Google AdWords Tool, you can get data about the cost and competition for PPC clicks and key phrases, while others such as Wordstream give you a measure of competition.

    Many people say you now have an indication of how much competition there may be for natural results, too. The argument goes that the more people are willing to pay for a click on a key phrase, the more valuable that traffic is, and the more likely site owners will be spending time and resources on securing a high organic ranking.

    Put another way, the higher the PPC bids, the more likely there will be high levels of competition for natural listings.
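    If you want to turn that rule of thumb into a rough first pass over your list, a simple opportunity score does the job: reward search volume, discount by the PPC bid you’re using as a competition proxy. This is purely an illustrative heuristic – none of the tools mentioned here use this exact formula, and the sample figures are invented:

```python
# Rough "opportunity" heuristic: higher search volume is good, higher
# PPC bids (our proxy for organic competition) count against a phrase.
# The weighting and sample data are invented for illustration only.

def opportunity(volume, cpc_bid):
    """Monthly searches discounted by the competition proxy."""
    return volume / (1.0 + cpc_bid)

candidates = [
    ("blue widgets", 12000, 4.50),           # high volume, pricey clicks
    ("buy blue widgets online", 900, 0.80),  # long tail, cheap clicks
    ("blue widget shop", 2400, 1.20),
]

ranked = sorted(candidates, key=lambda kw: opportunity(kw[1], kw[2]),
                reverse=True)
for phrase, volume, bid in ranked:
    print("%s: %.0f" % (phrase, opportunity(volume, bid)))
```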

    5. Choose the more accurate route that costs you money

    If you’re serious about your e-commerce website, the free route will almost certainly not be good enough for you. The relationship between PPC bids and organic competition will be too indirect for you to place much faith in it.

    Getting better and more accurate data will require you to subscribe to one or more paid services and run more complex analyses.

    Some time ago my business partner, Paul Silver, built an in-house custom tool to assess competition for key phrases, and I use that alongside some commercial services – I’m always trialling one or another to see if I can improve the quality of keyword insights and the efficiency of my workflow.

    • When it comes to worthwhile KPR, all roads lead to Moz, whose Page Authority and Domain Authority measures are used by many people comparing levels of competition, and by some tools as well. A Moz subscription will cost you from $99 a month
    • Long Tail Pro. A tool that gives you fast insights into competition for key phrases, using its own calculations. Although it was designed for a specific purpose – setting up niche sites – it can be used for KPR for most sites. You’ll need a Moz subscription as well
    • SEMrush. Last, but not least, a huge armoury of data and tools you’ll find useful as you progress with key phrase research.

    I’ll be writing more about how to do key phrase research in the coming weeks, but if you want some professional help, please take a look at my Key Phrase Research & Content Strategy service.

    Thanks to George Rex, Andy Mangold and Brett Jordan for making their images available.



    Posted 23 July 2014, 2:02 pm

  2. Favicon idimmu . net

    [Blog] idimmu . net: yum error: Couldn’t fork Cannot allocate memory

    Yum Couldn't Fork Cannot Allocate Memory I’ve been doing some awesome things to a new VM for work, namely installing CouchDB, Apache and running Node.JS apps along side a WordPress plugin using Angular.JS. It’s pretty cool. But computer’s are dicks so when it came down to installing Monit to ensure everything was lovely I got the following error: Couldn’t fork %pre(monit-5.5-1.el6.rf.x86_64): Cannot allocate memory. Bum.

    error: Couldn’t fork %pre(monit-5.5-1.el6.rf.x86_64): Cannot allocate memory

    Seems simple enough: for whatever reason Yum cannot allocate memory, so let’s take a peek.

    [root@bridge opt]# free
                 total       used       free     shared    buffers     cached
    Mem:       1020376     832736     187640          0       3988      81256
    -/+ buffers/cache:      747492     272884
    Swap:            0          0          0

    Man, there’s totally enough memory there: 187MB of RAM is free. Quake took less than that and is way more complicated than some stupid RPMs… maybe it’s something else!

    Quite often this error is caused by the RPM database having duplicates or getting corrupted in some way, so let’s try to clean that up.

    [root@bridge ~]# package-cleanup --cleandupes
    Loaded plugins: fastestmirror, protectbase
    Loading mirror speeds from cached hostfile
    * base:
    * epel:
    * extras:
    * rpmforge:
    * updates:
    1490 packages excluded due to repository protections
    No duplicates to remove
    [root@bridge ~]# rpm --rebuilddb

    Well, no duplicates and the RPM database is all cool, so let’s try again…

    [root@bridge ~]# yum install monit
    Loaded plugins: fastestmirror, protectbase


    Running Transaction
    Error in PREIN scriptlet in rpm package monit-5.5-1.el6.rf.x86_64
    error: Couldn't fork %pre(monit-5.5-1.el6.rf.x86_64): Cannot allocate memory
    error: install: %pre scriptlet failed (2), skipping monit-5.5-1.el6.rf
    Verifying : monit-5.5-1.el6.rf.x86_64 1/1

    monit.x86_64 0:5.5-1.el6.rf


    Man, haters gonna hate!

    Solving error: Couldn’t fork %pre(monit-5.5-1.el6.rf.x86_64): Cannot allocate memory

    OK, let’s step back a minute and assume the error is legit; let’s turn some stuff off…

    [root@bridge ~]# /etc/init.d/couchdb stop
    Stopping database server couchdb
    [root@bridge ~]# /etc/init.d/httpd stop
    Stopping httpd: [ OK ]

    And try again!

    [root@bridge ~]# yum install monit
    Loaded plugins: fastestmirror, protectbase


    Downloading Packages:
    monit-5.5-1.el6.rf.x86_64.rpm | 267 kB 00:00
    Running rpm_check_debug
    Running Transaction Test
    Transaction Test Succeeded
    Running Transaction
    Installing : monit-5.5-1.el6.rf.x86_64 1/1
    Verifying : monit-5.5-1.el6.rf.x86_64 1/1

    monit.x86_64 0:5.5-1.el6.rf


    Sweet, that did it. So it was a bona fide, legit error, and shutting some services down freed up enough memory to let us install RPMs again.

    [root@bridge ~]# free
                 total       used       free     shared    buffers     cached
    Mem:       1020376     510972     509404          0      11632     146780
    -/+ buffers/cache:      352560     667816
    Swap:            0          0          0

    Mmm, 509MB free, that’s a lot more… I guess Yum actually needs a ton of RAM to actually do anything. Weird. If you guys get this problem, try turning some services off and on again ;)
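    For the record, the arithmetic behind that free output: Linux counts buffers and cache as “used”, but it can drop them on demand, so the memory genuinely available to new allocations is free + buffers + cached – which is exactly what the “-/+ buffers/cache” row shows. Here’s the sum in Python, using the numbers from the failing run:

```python
# Reconstruct the "-/+ buffers/cache" row of `free` from the failing
# run above. All values are in KiB, straight from that output.
mem_total = 1020376
mem_used  = 832736
mem_free  = 187640
buffers   = 3988
cached    = 81256

# Buffers and page cache are reclaimable, so they count as available.
available   = mem_free + buffers + cached   # what new processes can claim
really_used = mem_used - buffers - cached   # memory apps actually hold

print(available, really_used)
```

    Note that even with reclaimable cache, a fork() can still fail when there’s no swap configured, because the kernel may refuse to overcommit a full copy of a large parent process’s address space – which is presumably why stopping CouchDB and Apache did the trick.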


    Posted 23 July 2014, 11:07 am

  3. Favicon Adactio: Journal

    [Blog] Adactio: Journal: Adactibots

    I post a few links on this site every day—around 4 or 5, on average. If you subscribe to the RSS feed, then you’ll know about them (I also push them to Delicious but I don’t recommend relying on that).

    If you don’t use RSS—you lawn-off-getting youngster, you—then you’d pretty much have to actually visit my website to see what I’m linking to. How quaint!

    Here, let me throw you a bone in the shape of a Twitter bot. You can now follow @adactioLinks.

    I made a little If This, Then That recipe which will ping the RSS feed and update the Twitter account whenever there’s a new link.

    I’ve done the same thing for my journal (or “blog”, short for “weblog”, if you will). You can either subscribe to the journal’s RSS feed or decide that that’s far too much hassle, and just follow @adactioJournal on Twitter instead.

    The journal postings are far less frequent than the links. But I still figured I’d provide a separate, automated Twitter account because I do not want to be that guy saying “In case you missed it earlier…” from my human account …although technically, even my non-bot account is auto-generated: my status updates start life as notes on my own site—Twitter just gets a copy.

    There’s also @adactioArticles for longer-form articles and talk transcripts but that’s very, very infrequent—just a few posts a year.

    So these Twitter accounts correspond to the different types of posts on this site, in decreasing order of frequency.

    Posted 22 July 2014, 6:36 pm

  4. Favicon Adactio: Journal

    [Blog] Adactio: Journal: Indie Web Camp Brighton

    If you’re coming to this year’s dConstruct here in Brighton on September 5th—and you really, really should—then consider sticking around for the weekend.

    Not only will there be the fantastic annual Maker Faire on Saturday, September 6th, but there’s also an Indie Web Camp happening at 68 Middle Street on the Saturday and Sunday.

    We had an Indie Web Camp right after last year’s dConstruct and it was really good fun …and very productive to boot. The format works really well: one day of discussions and brainstorming, and one day of hacking, designing, and building.

    So if you find yourself agreeing with the design principles of the Indie Web, be sure to come along. Add yourself to the list of attendees.

    If you’re coming from outside Brighton for the dConstruct/Indie Web weekend, take a look at the dConstruct page on AirBnB for some accommodation ideas at very reasonable rates.

    Speaking of reasonable rates… just between you and me, I’ve created a discount code for any Indie Web Campers who are coming to dConstruct. Use the discount code “indieweb” to shave £25 off the ticket price (bringing it down to £125 + VAT). And trust me, you do not want to miss this year’s dConstruct.

    It’s just a little over six weeks until the best weekend in Brighton. I hope I’ll see you here then.

    Posted 22 July 2014, 5:41 pm

  5. Favicon SiteVisibility

    [Blog] SiteVisibility: Thinking About DNS – Mark Lewis – Podcast Episode #254

    In this week’s internet marketing podcast Andy talks to Mark Lewis, Senior Account Manager at Dyn, about the Domain Name System (DNS). Mark talks through the way DNS works – a complex system that’s much like a phone directory for websites. He discusses the benefits of outsourcing your DNS to a specialist provider, such as better security and faster speeds for users to reach a site, and then the limitations that can come from having basic DNS, which can ultimately affect brand reputation, decrease revenue and lead to a bad customer experience. He finishes by discussing some great tools for assessing your DNS, which are listed below.




    Free DNS performance test

    Post from Apple Pie & Custard blog by SiteVisibility - An SEO Agency



    Posted 22 July 2014, 11:20 am

  6. Favicon Wired Sussex Digital Media News

    [Blog] Wired Sussex Digital Media News: Matchbox Mobile launches Loyalty Platform for coffee stores and restaurants.

    Matchbox Mobile has created Matchbox Loyalty, a state-of-the-art customer loyalty solution for coffee stores and restaurants that runs on customers’ smartphones. Matchbox Loyalty does away with the need for card-based loyalty schemes. Instead of physical ...

    Posted 22 July 2014, 1:00 am

  7. Favicon Wired Sussex Digital Media News

    [Blog] Wired Sussex Digital Media News: Ocasta Studios Launch easyCar Club's iPhone App - Rent Cars Straight From Your Phone

    Technology empowers the sharing economy. You now stream music through services such as Spotify or Rdio instead of collecting dusty CDs. You stay in someone’s house for a weekend with Airbnb in just a few taps. You hunt for the perfect designer dress ...

    Posted 22 July 2014, 1:00 am

  8. Favicon Wired Sussex Digital Media News

    [Blog] Wired Sussex Digital Media News: Get excited - BDMF is back in town!

    BDMF is back and in its fourth year! This year’s festival will be held on Thursday September 18th and boasts a better line up than ever. Due to popular demand we have doubled our ticket allocation, meaning this time round we have space for 350 marketers! With ...

    Posted 22 July 2014, 1:00 am

  9. Favicon SiteVisibility

    [Blog] SiteVisibility: The Content Marketing Show 2014

    Twice a year some of the biggest names in digital come together to share their top tips, experiences and predictions for the content marketing field.  The Content Marketing Show is for anyone in the field looking to keep abreast of the latest innovations and catch up with others who are passionate about content – and it’s free!

    The summer 2014 conference once again proved hugely popular with some fantastic insight shared on the day – the SiteVisibility team for one came away bursting with ideas for revitalising our content strategies and we’ve already begun to put some of what we’ve learnt into practice back in the office!

    Here’s our Storify from the day.

    [View the story "#ContentMarketingShow" on Storify]

    Post from Apple Pie & Custard blog by SiteVisibility - An SEO Agency



    Posted 21 July 2014, 5:00 pm

  10. Writing For SEO

    [Blog] Writing For SEO: My top social shares last week

    I share on Twitter, Facebook, LinkedIn and Google Plus. These are the posts that got the most shares.

    1. How You Can Remove A Page From Google in Three Steps
    2. Think Twice Before Using Top Of The Page In AdWords
    3. Infographic: Digital To Grow To 75% Of Marketing Budgets In Next Five Years
    4. Promoting Modern Websites For Modern Devices In Google Search Results
    5. You’ve Been Hit By Penguin! Should You Start Over Or Try To Recover?

    Why not give them a read if you missed them?

    Thanks to Alan Levine for making his image available via Creative Commons.

    Posted 21 July 2014, 11:48 am

  11. Favicon Wired Sussex Digital Media News

    [Blog] Wired Sussex Digital Media News: Launched! Honeycomb Digital - eCommerce Specialist Services

    I'm excited to announce my new website and rebrand is finally complete. Previously Honeycomb Imaging, I spent my time helping out small and personal businesses with product photography, often imbuing eCommerce knowledge at the same time. I realised people ...

    Posted 21 July 2014, 1:00 am

  12. Favicon Wired Sussex Digital Media News


    Box of Frogs Media has teamed up with Australian children’s book author and illustrator Daniel Corcoran and now Dave Benson Phillips to bring you the universally amusing print title, The Iddly Widdly Piddly Pop-off to the appbook world. Dave Benson-Phillips, ...

    Posted 21 July 2014, 1:00 am

  13. Entrepreneurial Geekiness

    [Blog] Entrepreneurial Geekiness: IPython Memory Usage interactive tool

    I’ve written a tool (ipython_memory_usage) to help my colleague and me understand how RAM is allocated during large matrix work. It’ll work for any large memory allocations (numpy or regular Python or whatever), and the allocs/deallocs are reported after every command. Here’s an example – we make a matrix of 10,000,000 elements costing 76MB and then delete it:

    IPython 2.1.0 -- An enhanced Interactive Python.
    In [1]: %run -i
    In [2]: a=np.ones(1e7)
    'a=np.ones(1e7)' used 76.2305 MiB RAM in 0.32s, 
    peaked 0.00 MiB above current, total RAM usage 125.61 MiB 
    In [3]: del a 
    'del a' used -76.2031 MiB RAM in 0.10s, 
    peaked 0.00 MiB above current, total RAM usage 49.40 MiB


    The more interesting behaviour is to check the intermediate RAM usage during an operation. In the following example we’ve got three arrays costing approx. 760MB each, and we assign their result to a fourth array; overall the operation adds the cost of a temporary fifth array, which would be invisible to the end user if they weren’t aware of the use of temporaries in the background:

    In [2]: a=np.ones(1e8); b=np.ones(1e8); c=np.ones(1e8)
    'a=np.ones(1e8); b=np.ones(1e8); c=np.ones(1e8)' 
    used 2288.8750 MiB RAM in 1.02s, 
    peaked 0.00 MiB above current, total RAM usage 2338.06 MiB 
    In [3]: d=a*b+c 
    'd=a*b+c' used 762.9453 MiB RAM in 0.91s, 
    peaked 667.91 MiB above current, total RAM usage 3101.01 MiB


    If you’re running out of RAM when you work with large datasets in IPython, this tool should give you a clue as to where your RAM is being used.
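    Those figures are easy to sanity-check by hand: a numpy float64 costs 8 bytes, so an array of n elements should cost n * 8 / 2**20 MiB. A quick plain-Python check (no numpy needed) against the numbers above:

```python
# Back-of-the-envelope check of the MiB figures reported by the tool:
# a numpy float64 array of n elements costs n * 8 bytes.
def array_mib(n_elements, bytes_per_element=8):
    return n_elements * bytes_per_element / 2**20

print(round(array_mib(1e7), 2))  # 76.29, close to the 76.23 MiB reported
print(round(array_mib(1e8), 2))  # 762.94, matching the second example
```

    The 667.91 MiB peak on d=a*b+c is roughly one more array of that size: the hidden temporary holding a*b before c is added.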

    UPDATE – this works in IPython for PyPy too and so we can show off their homogeneous memory optimisation:

    # CPython 2.7
    In [3]: l=range(int(1e8))
    'l=range(int(1e8))' used 3107.5117 MiB RAM in 2.18s, 
    peaked 0.00 MiB above current, total RAM usage 3157.91 MiB
    And the same in PyPy:
    # IPython with PyPy 2.7
    In [7]: l=[x for x in range(int(1e8))]
    'l=[x for x in range(int(1e8))]' used 763.2031 MiB RAM in 9.88s, 
    peaked 0.00 MiB above current, total RAM usage 815.09 MiB

    If we then add a non-homogeneous type (e.g. appending None to the ints), the list gets converted back to a list of regular Python (heavy-weight) objects:

    In [8]:  l.append(None)
    'l.append(None)' used 3850.1680 MiB RAM in 8.16s, 
    peaked 0.00 MiB above current, total RAM usage 4667.53 MiB

    The inspiration for this tool came from a chat with my colleague where we were discussing the memory usage techniques I discussed in my new High Performance Python book and I realised that what we needed was a lighter-weight tool that just ran in the background.

    My colleague was fighting a scikit-learn feature matrix scaling problem where all the intermediate objects that led to a binarised matrix took >6GB on his 6GB laptop. As a result I wrote this tool (no, it isn’t in the book – I only wrote it last Saturday!). During discussion (and later validated with the tool) we got his allocation to <4GB so it ran without a hitch on his laptop.

    I’m probably going to demo this at a future PyDataLondon meetup.

    Ian applies Data Science as an AI/Data Scientist for companies in ModelInsight and Mor Consulting, founded the image and text annotation API, co-authored SocialTies, programs Python, authored The Screencasting Handbook, lives in London and is a consumer of fine coffees.

    Posted 18 July 2014, 9:12 am

  14. Favicon Wired Sussex Digital Media News

    [Blog] Wired Sussex Digital Media News: PaperSeven win Best Game at the Broadcast Digital Awards!

    PaperSeven, the Brighton-based games and apps developer, was delighted to become "an award-winning studio" last month. Winning 'Best Game' at the 2014 Broadcast Digital Awards! The studio won the award for its 'Made in Chelsea' game developed for ...

    Posted 18 July 2014, 1:00 am

  15. Writing For SEO

    [Blog] Writing For SEO: Remove A Page From Google In Three Steps

    A question in a comment led me to an answer that I think deserves a wider audience: how to remove a page from Google.

    If you’ve ever wanted to do that, here’s how.

    But before I get into that, here’s how this post started:

    FortitudoX asked on Header tags and how to use them in SEO Copy:

    Hi David,

    Quick (Well it started quick and then grew) question for you; I use archive pages as main pages and thus have roughly 25 H1s for each author page. I know keyword density isn’t a huge factor but does using my key word 25 times in a H1 setting on 1 page affect my SEO negatively? My keyword pops up roughly 75 times per page, 1/3 from H1, 1/3 from body and 1/3 from URL. Example page can be found here:http://www.brilliantlifequotes…

    This is one of those questions that I’ve never got an absolutely clear answer to – either from personal experience, or others’ recommendations.

    Let’s try to get to the bottom of your situation, though.

    There are two options open to you:

    1. Leave well alone!

    Writing For SEO has category pages such as this one. I don’t normally worry about pages like that getting spidered.

    Why? I have no reason to believe that Google doesn’t understand a standard blog structure – or, more to the point, a standard WordPress structure. After all, over 20% of the top 10 million sites on the Internet are supposed to be hosted on WordPress, so there are lots of sites doing this.

    And from what I’ve seen across many, many sites I’m convinced Google ignores duplicate content issues arising from things like category pages. However, the issue isn’t duplication on your pages, but over-optimisation – high key phrase densities.

    If you look at the content on my Category pages and compare it to your Bertrand Russell page, the key phrase densities are much lower for my less focused content. I don’t have the same key phrase repeated time after time.

    So if you’re finding your natural search engine results are getting worse, or you’ve never ranked for some of these terms, then you may want to try removing the author pages from Google’s index.

    2. Get Google to ignore the pages

    This may look a bit scary, but it’s really not that difficult – particularly if you’re using WordPress, as I think you are. So here goes…

    1. Stop the pages getting spidered

    The first thing to do is to put a meta tag in the header of each of the pages (or posts) that you want Google to forget about.

    <meta name="robots" content="noindex,follow" />

    It tells search engine robots not to index the page, but to follow any links to other pages.

    If you’re using WordPress, you can get this done much more simply by installing the WordPress SEO plug-in – the one everyone calls Yoast.

    To insert the Robots meta tag into a category, click on Categories under Posts on the WordPress dashboard, then click on Edit under whichever category you want to stop being spidered. Use the Noindex this category pull-down to choose Always Noindex. You can also exclude the page from the sitemap by using the other pull-down. Click Update and you’re done.

    If you want to Noindex a post, you’ll find the settings under the Advanced tab on the post edit screen. There’s an identical tab for Noindexing pages.

    2. Head over to Google Webmaster Tools

    Choose Remove URLs:

    Removing URLs using Google Webmaster Tools

    Click on Create a new removal request, and paste in your URL. Then choose Remove from Index and Cache and submit the URL.

    The URL will then show as Pending.

    Pending URL Removal

    Come back in a few days and check if its status has changed.

    Once it has, confirm your page has really been removed. Type:

    cache:your page’s URL

    into Google. A 404 error confirms your page is no longer in Google’s cache.

    3. Just to be safe

    OK. So Google has removed your URL.

    There’s just one more thing to do to remove a page from Google, just to be safe: add the pages to your robots.txt file.

    Here’s mine:

    My robots.txt file

    I set it up using the Files section in WordPress SEO, but you can create it as a text file and FTP it to your server.

    To make doubly sure that my category pages aren’t spidered, I add this line:

    Disallow: /category/

    and save my robots.txt file.
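    You can check a rule like that locally before going anywhere near Webmaster Tools, using the robots.txt parser in Python’s standard library (example.com here is just a stand-in for your own domain):

```python
# Verify locally that a Disallow rule blocks the category pages.
# urllib.robotparser ships with Python's standard library;
# example.com is a placeholder domain.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Disallow: /category/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

print(rp.can_fetch("*", "http://example.com/category/seo/"))  # False
print(rp.can_fetch("*", "http://example.com/some-post/"))     # True
```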

    As I wrote this post, Google updated its robots.txt tester. You can find it in Google Webmaster Tools, where you click robots.txt Tester under Crawl.

    Testing robots.txt

    The tool highlights any problems you have with the file.

    There’s lots more you can do with robots.txt. Visit this site for chapter and verse.

    FortitudoX: Let me know if you decide to remove your author pages from Google, and if it helps your site’s performance.

    Indeed, have any other readers had success by removing potentially problematic pages from Google?

    Thanks to Vox Efx for his great photo!


    Posted 17 July 2014, 10:30 am


These photos are the most recent added to the BNM Flickr Photo pool.



Recent Threads

This list of subject headings is generated from the last 50 posts made to the BNM mailing list which also had a response.

  1. File recovery after... 8 posts.

Last.fm artist chart

This is a chart of the most listened to artists in the BNM group. Chart for the week ending Sun, 20 Jul 2014.

  1. Kelis
  2. Pixies
  3. Caribou
  4. Daft Punk
  5. Stevie Wonder
  6. Jungle
  7. alt-J
  8. Beck
  9. Jack White
  10. Bastille

Chart updated every Sunday.

These are links tagged by members of the BNM mailing list with the tag ‘bnm’. If you find something you think other readers may find useful, why not do the same?


Events are taken from the BNM Upcoming Group. There are currently no events to display.

You can download, or subscribe to this schedule.